CHECK: Is CUDA the right version (10)?
Using backbone resnet101
Using data augmentation generator
weights arg is None
Loading imagenet weights
Creating model, this may take a second...
Loading weights into model
tracking anchors
tracking anchors
tracking anchors
tracking anchors
tracking anchors
Model: "retinanet"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, None, None, 3 0
padding_conv1 (ZeroPadding2D)   (None, None, None, 3 0           input_1[0][0]
conv1 (Conv2D)                  (None, None, None, 6 9408        padding_conv1[0][0]
bn_conv1 (BatchNormalization)   (None, None, None, 6 256         conv1[0][0]
conv1_relu (Activation)         (None, None, None, 6 0           bn_conv1[0][0]
pool1 (MaxPooling2D)            (None, None, None, 6 0           conv1_relu[0][0]
res2a_branch2a (Conv2D)         (None, None, None, 6 4096        pool1[0][0]
bn2a_branch2a (BatchNormalizati (None, None, None, 6 256         res2a_branch2a[0][0]
res2a_branch2a_relu (Activation (None, None, None, 6 0           bn2a_branch2a[0][0]
padding2a_branch2b (ZeroPadding (None, None, None, 6 0           res2a_branch2a_relu[0][0]
res2a_branch2b (Conv2D)         (None, None, None, 6 36864       padding2a_branch2b[0][0]
bn2a_branch2b (BatchNormalizati (None, None, None, 6 256         res2a_branch2b[0][0]
res2a_branch2b_relu (Activation (None, None, None, 6 0           bn2a_branch2b[0][0]
res2a_branch2c (Conv2D)         (None, None, None, 2 16384       res2a_branch2b_relu[0][0]
res2a_branch1 (Conv2D)          (None, None, None, 2 16384       pool1[0][0]
bn2a_branch2c (BatchNormalizati (None, None, None, 2 1024        res2a_branch2c[0][0]
bn2a_branch1 (BatchNormalizatio (None, None, None, 2 1024        res2a_branch1[0][0]
res2a (Add)                     (None, None, None, 2 0           bn2a_branch2c[0][0]
                                                                 bn2a_branch1[0][0]
res2a_relu (Activation)         (None, None, None, 2 0           res2a[0][0]
res2b_branch2a (Conv2D)         (None, None, None, 6 16384       res2a_relu[0][0]
bn2b_branch2a (BatchNormalizati (None, None, None, 6 256         res2b_branch2a[0][0]
res2b_branch2a_relu (Activation (None, None, None, 6 0           bn2b_branch2a[0][0]
padding2b_branch2b (ZeroPadding (None, None, None, 6 0           res2b_branch2a_relu[0][0]
res2b_branch2b (Conv2D)         (None, None, None, 6 36864       padding2b_branch2b[0][0]
bn2b_branch2b (BatchNormalizati (None, None, None, 6 256         res2b_branch2b[0][0]
res2b_branch2b_relu (Activation (None, None, None, 6 0           bn2b_branch2b[0][0]
res2b_branch2c (Conv2D)         (None, None, None, 2 16384       res2b_branch2b_relu[0][0]
bn2b_branch2c (BatchNormalizati (None, None, None, 2 1024        res2b_branch2c[0][0]
res2b (Add)                     (None, None, None, 2 0           bn2b_branch2c[0][0]
                                                                 res2a_relu[0][0]
res2b_relu (Activation)         (None, None, None, 2 0           res2b[0][0]
res2c_branch2a (Conv2D)         (None, None, None, 6 16384       res2b_relu[0][0]
bn2c_branch2a (BatchNormalizati (None, None, None, 6 256         res2c_branch2a[0][0]
res2c_branch2a_relu (Activation (None, None, None, 6 0           bn2c_branch2a[0][0]
padding2c_branch2b (ZeroPadding (None, None, None, 6 0           res2c_branch2a_relu[0][0]
res2c_branch2b (Conv2D)         (None, None, None, 6 36864       padding2c_branch2b[0][0]
bn2c_branch2b (BatchNormalizati (None, None, None, 6 256         res2c_branch2b[0][0]
res2c_branch2b_relu (Activation (None, None, None, 6 0           bn2c_branch2b[0][0]
res2c_branch2c (Conv2D)         (None, None, None, 2 16384       res2c_branch2b_relu[0][0]
bn2c_branch2c (BatchNormalizati (None, None, None, 2 1024        res2c_branch2c[0][0]
res2c (Add)                     (None, None, None, 2 0           bn2c_branch2c[0][0]
                                                                 res2b_relu[0][0]
res2c_relu (Activation)         (None, None, None, 2 0           res2c[0][0]
res3a_branch2a (Conv2D)         (None, None, None, 1 32768       res2c_relu[0][0]
bn3a_branch2a (BatchNormalizati (None, None, None, 1 512         res3a_branch2a[0][0]
res3a_branch2a_relu (Activation (None, None, None, 1 0           bn3a_branch2a[0][0]
padding3a_branch2b (ZeroPadding (None, None, None, 1 0           res3a_branch2a_relu[0][0]
res3a_branch2b (Conv2D)         (None, None, None, 1 147456      padding3a_branch2b[0][0]
bn3a_branch2b (BatchNormalizati (None, None, None, 1 512         res3a_branch2b[0][0]
res3a_branch2b_relu (Activation (None, None, None, 1 0           bn3a_branch2b[0][0]
res3a_branch2c (Conv2D)         (None, None, None, 5 65536       res3a_branch2b_relu[0][0]
res3a_branch1 (Conv2D)          (None, None, None, 5 131072      res2c_relu[0][0]
bn3a_branch2c (BatchNormalizati (None, None, None, 5 2048        res3a_branch2c[0][0]
bn3a_branch1 (BatchNormalizatio (None, None, None, 5 2048        res3a_branch1[0][0]
res3a (Add)                     (None, None, None, 5 0           bn3a_branch2c[0][0]
                                                                 bn3a_branch1[0][0]
res3a_relu (Activation)         (None, None, None, 5 0           res3a[0][0]
res3b1_branch2a (Conv2D)        (None, None, None, 1 65536       res3a_relu[0][0]
bn3b1_branch2a (BatchNormalizat (None, None, None, 1 512         res3b1_branch2a[0][0]
res3b1_branch2a_relu (Activatio (None, None, None, 1 0           bn3b1_branch2a[0][0]
padding3b1_branch2b (ZeroPaddin (None, None, None, 1 0           res3b1_branch2a_relu[0][0]
res3b1_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b1_branch2b[0][0]
bn3b1_branch2b (BatchNormalizat (None, None, None, 1 512         res3b1_branch2b[0][0]
res3b1_branch2b_relu (Activatio (None, None, None, 1 0           bn3b1_branch2b[0][0]
res3b1_branch2c (Conv2D)        (None, None, None, 5 65536       res3b1_branch2b_relu[0][0]
bn3b1_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b1_branch2c[0][0]
res3b1 (Add)                    (None, None, None, 5 0           bn3b1_branch2c[0][0]
                                                                 res3a_relu[0][0]
res3b1_relu (Activation)        (None, None, None, 5 0           res3b1[0][0]
res3b2_branch2a (Conv2D)        (None, None, None, 1 65536       res3b1_relu[0][0]
bn3b2_branch2a (BatchNormalizat (None, None, None, 1 512         res3b2_branch2a[0][0]
res3b2_branch2a_relu (Activatio (None, None, None, 1 0           bn3b2_branch2a[0][0]
padding3b2_branch2b (ZeroPaddin (None, None, None, 1 0           res3b2_branch2a_relu[0][0]
res3b2_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b2_branch2b[0][0]
bn3b2_branch2b (BatchNormalizat (None, None, None, 1 512         res3b2_branch2b[0][0]
res3b2_branch2b_relu (Activatio (None, None, None, 1 0           bn3b2_branch2b[0][0]
res3b2_branch2c (Conv2D)        (None, None, None, 5 65536       res3b2_branch2b_relu[0][0]
bn3b2_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b2_branch2c[0][0]
res3b2 (Add)                    (None, None, None, 5 0           bn3b2_branch2c[0][0]
                                                                 res3b1_relu[0][0]
res3b2_relu (Activation)        (None, None, None, 5 0           res3b2[0][0]
res3b3_branch2a (Conv2D)        (None, None, None, 1 65536       res3b2_relu[0][0]
bn3b3_branch2a (BatchNormalizat (None, None, None, 1 512         res3b3_branch2a[0][0]
res3b3_branch2a_relu (Activatio (None, None, None, 1 0           bn3b3_branch2a[0][0]
padding3b3_branch2b (ZeroPaddin (None, None, None, 1 0           res3b3_branch2a_relu[0][0]
res3b3_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b3_branch2b[0][0]
bn3b3_branch2b (BatchNormalizat (None, None, None, 1 512         res3b3_branch2b[0][0]
res3b3_branch2b_relu (Activatio (None, None, None, 1 0           bn3b3_branch2b[0][0]
res3b3_branch2c (Conv2D)        (None, None, None, 5 65536       res3b3_branch2b_relu[0][0]
bn3b3_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b3_branch2c[0][0]
res3b3 (Add)                    (None, None, None, 5 0           bn3b3_branch2c[0][0]
                                                                 res3b2_relu[0][0]
res3b3_relu (Activation)        (None, None, None, 5 0           res3b3[0][0]
res4a_branch2a (Conv2D)         (None, None, None, 2 131072      res3b3_relu[0][0]
bn4a_branch2a (BatchNormalizati (None, None, None, 2 1024        res4a_branch2a[0][0]
res4a_branch2a_relu (Activation (None, None, None, 2 0           bn4a_branch2a[0][0]
padding4a_branch2b (ZeroPadding (None, None, None, 2 0           res4a_branch2a_relu[0][0]
res4a_branch2b (Conv2D)         (None, None, None, 2 589824      padding4a_branch2b[0][0]
bn4a_branch2b (BatchNormalizati (None, None, None, 2 1024        res4a_branch2b[0][0]
res4a_branch2b_relu (Activation (None, None, None, 2 0           bn4a_branch2b[0][0]
res4a_branch2c (Conv2D)         (None, None, None, 1 262144      res4a_branch2b_relu[0][0]
res4a_branch1 (Conv2D)          (None, None, None, 1 524288      res3b3_relu[0][0]
bn4a_branch2c (BatchNormalizati (None, None, None, 1 4096        res4a_branch2c[0][0]
bn4a_branch1 (BatchNormalizatio (None, None, None, 1 4096        res4a_branch1[0][0]
res4a (Add)                     (None, None, None, 1 0           bn4a_branch2c[0][0]
                                                                 bn4a_branch1[0][0]
res4a_relu (Activation)         (None, None, None, 1 0           res4a[0][0]
res4b1_branch2a (Conv2D)        (None, None, None, 2 262144      res4a_relu[0][0]
bn4b1_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b1_branch2a[0][0]
res4b1_branch2a_relu (Activatio (None, None, None, 2 0           bn4b1_branch2a[0][0]
padding4b1_branch2b (ZeroPaddin (None, None, None, 2 0           res4b1_branch2a_relu[0][0]
res4b1_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b1_branch2b[0][0]
bn4b1_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b1_branch2b[0][0]
res4b1_branch2b_relu (Activatio (None, None, None, 2 0           bn4b1_branch2b[0][0]
res4b1_branch2c (Conv2D)        (None, None, None, 1 262144      res4b1_branch2b_relu[0][0]
bn4b1_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b1_branch2c[0][0]
res4b1 (Add)                    (None, None, None, 1 0           bn4b1_branch2c[0][0]
                                                                 res4a_relu[0][0]
res4b1_relu (Activation)        (None, None, None, 1 0           res4b1[0][0]
res4b2_branch2a (Conv2D)        (None, None, None, 2 262144      res4b1_relu[0][0]
bn4b2_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b2_branch2a[0][0]
res4b2_branch2a_relu (Activatio (None, None, None, 2 0           bn4b2_branch2a[0][0]
padding4b2_branch2b (ZeroPaddin (None, None, None, 2 0           res4b2_branch2a_relu[0][0]
res4b2_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b2_branch2b[0][0]
bn4b2_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b2_branch2b[0][0]
res4b2_branch2b_relu (Activatio (None, None, None, 2 0           bn4b2_branch2b[0][0]
res4b2_branch2c (Conv2D)        (None, None, None, 1 262144      res4b2_branch2b_relu[0][0]
bn4b2_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b2_branch2c[0][0]
res4b2 (Add)                    (None, None, None, 1 0           bn4b2_branch2c[0][0]
                                                                 res4b1_relu[0][0]
res4b2_relu (Activation)        (None, None, None, 1 0           res4b2[0][0]
res4b3_branch2a (Conv2D)        (None, None, None, 2 262144      res4b2_relu[0][0]
bn4b3_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b3_branch2a[0][0]
res4b3_branch2a_relu (Activatio (None, None, None, 2 0           bn4b3_branch2a[0][0]
padding4b3_branch2b (ZeroPaddin (None, None, None, 2 0           res4b3_branch2a_relu[0][0]
res4b3_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b3_branch2b[0][0]
bn4b3_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b3_branch2b[0][0]
res4b3_branch2b_relu (Activatio (None, None, None, 2 0           bn4b3_branch2b[0][0]
res4b3_branch2c (Conv2D)        (None, None, None, 1 262144      res4b3_branch2b_relu[0][0]
bn4b3_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b3_branch2c[0][0]
res4b3 (Add)                    (None, None, None, 1 0           bn4b3_branch2c[0][0]
                                                                 res4b2_relu[0][0]
res4b3_relu (Activation)        (None, None, None, 1 0           res4b3[0][0]
res4b4_branch2a (Conv2D)        (None, None, None, 2 262144      res4b3_relu[0][0]
bn4b4_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b4_branch2a[0][0]
res4b4_branch2a_relu (Activatio (None, None, None, 2 0           bn4b4_branch2a[0][0]
padding4b4_branch2b (ZeroPaddin (None, None, None, 2 0           res4b4_branch2a_relu[0][0]
res4b4_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b4_branch2b[0][0]
bn4b4_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b4_branch2b[0][0]
res4b4_branch2b_relu (Activatio (None, None, None, 2 0           bn4b4_branch2b[0][0]
res4b4_branch2c (Conv2D)        (None, None, None, 1 262144      res4b4_branch2b_relu[0][0]
bn4b4_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b4_branch2c[0][0]
__________________________________________________________________________________________________ res4b4 (Add) (None, None, None, 1 0 bn4b4_branch2c[0][0] res4b3_relu[0][0] __________________________________________________________________________________________________ res4b4_relu (Activation) (None, None, None, 1 0 res4b4[0][0] __________________________________________________________________________________________________ res4b5_branch2a (Conv2D) (None, None, None, 2 262144 res4b4_relu[0][0] __________________________________________________________________________________________________ bn4b5_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b5_branch2a[0][0] __________________________________________________________________________________________________ res4b5_branch2a_relu (Activatio (None, None, None, 2 0 bn4b5_branch2a[0][0] __________________________________________________________________________________________________ padding4b5_branch2b (ZeroPaddin (None, None, None, 2 0 res4b5_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b5_branch2b (Conv2D) (None, None, None, 2 589824 padding4b5_branch2b[0][0] __________________________________________________________________________________________________ bn4b5_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b5_branch2b[0][0] __________________________________________________________________________________________________ res4b5_branch2b_relu (Activatio (None, None, None, 2 0 bn4b5_branch2b[0][0] __________________________________________________________________________________________________ res4b5_branch2c (Conv2D) (None, None, None, 1 262144 res4b5_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b5_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b5_branch2c[0][0] 
__________________________________________________________________________________________________ res4b5 (Add) (None, None, None, 1 0 bn4b5_branch2c[0][0] res4b4_relu[0][0] __________________________________________________________________________________________________ res4b5_relu (Activation) (None, None, None, 1 0 res4b5[0][0] __________________________________________________________________________________________________ res4b6_branch2a (Conv2D) (None, None, None, 2 262144 res4b5_relu[0][0] __________________________________________________________________________________________________ bn4b6_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b6_branch2a[0][0] __________________________________________________________________________________________________ res4b6_branch2a_relu (Activatio (None, None, None, 2 0 bn4b6_branch2a[0][0] __________________________________________________________________________________________________ padding4b6_branch2b (ZeroPaddin (None, None, None, 2 0 res4b6_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b6_branch2b (Conv2D) (None, None, None, 2 589824 padding4b6_branch2b[0][0] __________________________________________________________________________________________________ bn4b6_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b6_branch2b[0][0] __________________________________________________________________________________________________ res4b6_branch2b_relu (Activatio (None, None, None, 2 0 bn4b6_branch2b[0][0] __________________________________________________________________________________________________ res4b6_branch2c (Conv2D) (None, None, None, 1 262144 res4b6_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b6_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b6_branch2c[0][0] 
__________________________________________________________________________________________________ res4b6 (Add) (None, None, None, 1 0 bn4b6_branch2c[0][0] res4b5_relu[0][0] __________________________________________________________________________________________________ res4b6_relu (Activation) (None, None, None, 1 0 res4b6[0][0] __________________________________________________________________________________________________ res4b7_branch2a (Conv2D) (None, None, None, 2 262144 res4b6_relu[0][0] __________________________________________________________________________________________________ bn4b7_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b7_branch2a[0][0] __________________________________________________________________________________________________ res4b7_branch2a_relu (Activatio (None, None, None, 2 0 bn4b7_branch2a[0][0] __________________________________________________________________________________________________ padding4b7_branch2b (ZeroPaddin (None, None, None, 2 0 res4b7_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b7_branch2b (Conv2D) (None, None, None, 2 589824 padding4b7_branch2b[0][0] __________________________________________________________________________________________________ bn4b7_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b7_branch2b[0][0] __________________________________________________________________________________________________ res4b7_branch2b_relu (Activatio (None, None, None, 2 0 bn4b7_branch2b[0][0] __________________________________________________________________________________________________ res4b7_branch2c (Conv2D) (None, None, None, 1 262144 res4b7_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b7_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b7_branch2c[0][0] 
__________________________________________________________________________________________________ res4b7 (Add) (None, None, None, 1 0 bn4b7_branch2c[0][0] res4b6_relu[0][0] __________________________________________________________________________________________________ res4b7_relu (Activation) (None, None, None, 1 0 res4b7[0][0] __________________________________________________________________________________________________ res4b8_branch2a (Conv2D) (None, None, None, 2 262144 res4b7_relu[0][0] __________________________________________________________________________________________________ bn4b8_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b8_branch2a[0][0] __________________________________________________________________________________________________ res4b8_branch2a_relu (Activatio (None, None, None, 2 0 bn4b8_branch2a[0][0] __________________________________________________________________________________________________ padding4b8_branch2b (ZeroPaddin (None, None, None, 2 0 res4b8_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b8_branch2b (Conv2D) (None, None, None, 2 589824 padding4b8_branch2b[0][0] __________________________________________________________________________________________________ bn4b8_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b8_branch2b[0][0] __________________________________________________________________________________________________ res4b8_branch2b_relu (Activatio (None, None, None, 2 0 bn4b8_branch2b[0][0] __________________________________________________________________________________________________ res4b8_branch2c (Conv2D) (None, None, None, 1 262144 res4b8_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b8_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b8_branch2c[0][0] 
res4b8 (Add) (None, None, None, 1 0 bn4b8_branch2c[0][0] res4b7_relu[0][0]
res4b8_relu (Activation) (None, None, None, 1 0 res4b8[0][0]
res4b9_branch2a (Conv2D) (None, None, None, 2 262144 res4b8_relu[0][0]
bn4b9_branch2a (BatchNormalizat (None, None, None, 2 1024 res4b9_branch2a[0][0]
res4b9_branch2a_relu (Activatio (None, None, None, 2 0 bn4b9_branch2a[0][0]
padding4b9_branch2b (ZeroPaddin (None, None, None, 2 0 res4b9_branch2a_relu[0][0]
res4b9_branch2b (Conv2D) (None, None, None, 2 589824 padding4b9_branch2b[0][0]
bn4b9_branch2b (BatchNormalizat (None, None, None, 2 1024 res4b9_branch2b[0][0]
res4b9_branch2b_relu (Activatio (None, None, None, 2 0 bn4b9_branch2b[0][0]
res4b9_branch2c (Conv2D) (None, None, None, 1 262144 res4b9_branch2b_relu[0][0]
bn4b9_branch2c (BatchNormalizat (None, None, None, 1 4096 res4b9_branch2c[0][0]
res4b9 (Add) (None, None, None, 1 0 bn4b9_branch2c[0][0] res4b8_relu[0][0]
res4b9_relu (Activation) (None, None, None, 1 0 res4b9[0][0]
res4b10_branch2a (Conv2D) (None, None, None, 2 262144 res4b9_relu[0][0]
bn4b10_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b10_branch2a[0][0]
res4b10_branch2a_relu (Activati (None, None, None, 2 0 bn4b10_branch2a[0][0]
padding4b10_branch2b (ZeroPaddi (None, None, None, 2 0 res4b10_branch2a_relu[0][0]
res4b10_branch2b (Conv2D) (None, None, None, 2 589824 padding4b10_branch2b[0][0]
bn4b10_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b10_branch2b[0][0]
res4b10_branch2b_relu (Activati (None, None, None, 2 0 bn4b10_branch2b[0][0]
res4b10_branch2c (Conv2D) (None, None, None, 1 262144 res4b10_branch2b_relu[0][0]
bn4b10_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b10_branch2c[0][0]
res4b10 (Add) (None, None, None, 1 0 bn4b10_branch2c[0][0] res4b9_relu[0][0]
res4b10_relu (Activation) (None, None, None, 1 0 res4b10[0][0]
res4b11_branch2a (Conv2D) (None, None, None, 2 262144 res4b10_relu[0][0]
bn4b11_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b11_branch2a[0][0]
res4b11_branch2a_relu (Activati (None, None, None, 2 0 bn4b11_branch2a[0][0]
padding4b11_branch2b (ZeroPaddi (None, None, None, 2 0 res4b11_branch2a_relu[0][0]
res4b11_branch2b (Conv2D) (None, None, None, 2 589824 padding4b11_branch2b[0][0]
bn4b11_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b11_branch2b[0][0]
res4b11_branch2b_relu (Activati (None, None, None, 2 0 bn4b11_branch2b[0][0]
res4b11_branch2c (Conv2D) (None, None, None, 1 262144 res4b11_branch2b_relu[0][0]
bn4b11_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b11_branch2c[0][0]
res4b11 (Add) (None, None, None, 1 0 bn4b11_branch2c[0][0] res4b10_relu[0][0]
res4b11_relu (Activation) (None, None, None, 1 0 res4b11[0][0]
res4b12_branch2a (Conv2D) (None, None, None, 2 262144 res4b11_relu[0][0]
bn4b12_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b12_branch2a[0][0]
res4b12_branch2a_relu (Activati (None, None, None, 2 0 bn4b12_branch2a[0][0]
padding4b12_branch2b (ZeroPaddi (None, None, None, 2 0 res4b12_branch2a_relu[0][0]
res4b12_branch2b (Conv2D) (None, None, None, 2 589824 padding4b12_branch2b[0][0]
bn4b12_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b12_branch2b[0][0]
res4b12_branch2b_relu (Activati (None, None, None, 2 0 bn4b12_branch2b[0][0]
res4b12_branch2c (Conv2D) (None, None, None, 1 262144 res4b12_branch2b_relu[0][0]
bn4b12_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b12_branch2c[0][0]
res4b12 (Add) (None, None, None, 1 0 bn4b12_branch2c[0][0] res4b11_relu[0][0]
res4b12_relu (Activation) (None, None, None, 1 0 res4b12[0][0]
res4b13_branch2a (Conv2D) (None, None, None, 2 262144 res4b12_relu[0][0]
bn4b13_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b13_branch2a[0][0]
res4b13_branch2a_relu (Activati (None, None, None, 2 0 bn4b13_branch2a[0][0]
padding4b13_branch2b (ZeroPaddi (None, None, None, 2 0 res4b13_branch2a_relu[0][0]
res4b13_branch2b (Conv2D) (None, None, None, 2 589824 padding4b13_branch2b[0][0]
bn4b13_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b13_branch2b[0][0]
res4b13_branch2b_relu (Activati (None, None, None, 2 0 bn4b13_branch2b[0][0]
res4b13_branch2c (Conv2D) (None, None, None, 1 262144 res4b13_branch2b_relu[0][0]
bn4b13_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b13_branch2c[0][0]
res4b13 (Add) (None, None, None, 1 0 bn4b13_branch2c[0][0] res4b12_relu[0][0]
res4b13_relu (Activation) (None, None, None, 1 0 res4b13[0][0]
res4b14_branch2a (Conv2D) (None, None, None, 2 262144 res4b13_relu[0][0]
bn4b14_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b14_branch2a[0][0]
res4b14_branch2a_relu (Activati (None, None, None, 2 0 bn4b14_branch2a[0][0]
padding4b14_branch2b (ZeroPaddi (None, None, None, 2 0 res4b14_branch2a_relu[0][0]
res4b14_branch2b (Conv2D) (None, None, None, 2 589824 padding4b14_branch2b[0][0]
bn4b14_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b14_branch2b[0][0]
res4b14_branch2b_relu (Activati (None, None, None, 2 0 bn4b14_branch2b[0][0]
res4b14_branch2c (Conv2D) (None, None, None, 1 262144 res4b14_branch2b_relu[0][0]
bn4b14_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b14_branch2c[0][0]
res4b14 (Add) (None, None, None, 1 0 bn4b14_branch2c[0][0] res4b13_relu[0][0]
res4b14_relu (Activation) (None, None, None, 1 0 res4b14[0][0]
res4b15_branch2a (Conv2D) (None, None, None, 2 262144 res4b14_relu[0][0]
bn4b15_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b15_branch2a[0][0]
res4b15_branch2a_relu (Activati (None, None, None, 2 0 bn4b15_branch2a[0][0]
padding4b15_branch2b (ZeroPaddi (None, None, None, 2 0 res4b15_branch2a_relu[0][0]
res4b15_branch2b (Conv2D) (None, None, None, 2 589824 padding4b15_branch2b[0][0]
bn4b15_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b15_branch2b[0][0]
res4b15_branch2b_relu (Activati (None, None, None, 2 0 bn4b15_branch2b[0][0]
res4b15_branch2c (Conv2D) (None, None, None, 1 262144 res4b15_branch2b_relu[0][0]
bn4b15_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b15_branch2c[0][0]
res4b15 (Add) (None, None, None, 1 0 bn4b15_branch2c[0][0] res4b14_relu[0][0]
res4b15_relu (Activation) (None, None, None, 1 0 res4b15[0][0]
res4b16_branch2a (Conv2D) (None, None, None, 2 262144 res4b15_relu[0][0]
bn4b16_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b16_branch2a[0][0]
res4b16_branch2a_relu (Activati (None, None, None, 2 0 bn4b16_branch2a[0][0]
padding4b16_branch2b (ZeroPaddi (None, None, None, 2 0 res4b16_branch2a_relu[0][0]
res4b16_branch2b (Conv2D) (None, None, None, 2 589824 padding4b16_branch2b[0][0]
bn4b16_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b16_branch2b[0][0]
res4b16_branch2b_relu (Activati (None, None, None, 2 0 bn4b16_branch2b[0][0]
res4b16_branch2c (Conv2D) (None, None, None, 1 262144 res4b16_branch2b_relu[0][0]
bn4b16_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b16_branch2c[0][0]
res4b16 (Add) (None, None, None, 1 0 bn4b16_branch2c[0][0] res4b15_relu[0][0]
res4b16_relu (Activation) (None, None, None, 1 0 res4b16[0][0]
res4b17_branch2a (Conv2D) (None, None, None, 2 262144 res4b16_relu[0][0]
bn4b17_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b17_branch2a[0][0]
res4b17_branch2a_relu (Activati (None, None, None, 2 0 bn4b17_branch2a[0][0]
padding4b17_branch2b (ZeroPaddi (None, None, None, 2 0 res4b17_branch2a_relu[0][0]
res4b17_branch2b (Conv2D) (None, None, None, 2 589824 padding4b17_branch2b[0][0]
bn4b17_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b17_branch2b[0][0]
res4b17_branch2b_relu (Activati (None, None, None, 2 0 bn4b17_branch2b[0][0]
res4b17_branch2c (Conv2D) (None, None, None, 1 262144 res4b17_branch2b_relu[0][0]
bn4b17_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b17_branch2c[0][0]
res4b17 (Add) (None, None, None, 1 0 bn4b17_branch2c[0][0] res4b16_relu[0][0]
res4b17_relu (Activation) (None, None, None, 1 0 res4b17[0][0]
res4b18_branch2a (Conv2D) (None, None, None, 2 262144 res4b17_relu[0][0]
bn4b18_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b18_branch2a[0][0]
res4b18_branch2a_relu (Activati (None, None, None, 2 0 bn4b18_branch2a[0][0]
padding4b18_branch2b (ZeroPaddi (None, None, None, 2 0 res4b18_branch2a_relu[0][0]
res4b18_branch2b (Conv2D) (None, None, None, 2 589824 padding4b18_branch2b[0][0]
bn4b18_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b18_branch2b[0][0]
res4b18_branch2b_relu (Activati (None, None, None, 2 0 bn4b18_branch2b[0][0]
res4b18_branch2c (Conv2D) (None, None, None, 1 262144 res4b18_branch2b_relu[0][0]
bn4b18_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b18_branch2c[0][0]
res4b18 (Add) (None, None, None, 1 0 bn4b18_branch2c[0][0] res4b17_relu[0][0]
res4b18_relu (Activation) (None, None, None, 1 0 res4b18[0][0]
res4b19_branch2a (Conv2D) (None, None, None, 2 262144 res4b18_relu[0][0]
bn4b19_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b19_branch2a[0][0]
res4b19_branch2a_relu (Activati (None, None, None, 2 0 bn4b19_branch2a[0][0]
padding4b19_branch2b (ZeroPaddi (None, None, None, 2 0 res4b19_branch2a_relu[0][0]
res4b19_branch2b (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b[0][0]
bn4b19_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b19_branch2b[0][0]
res4b19_branch2b_relu (Activati (None, None, None, 2 0 bn4b19_branch2b[0][0]
res4b19_branch2c (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu[0][0]
bn4b19_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b19_branch2c[0][0]
res4b19 (Add) (None, None, None, 1 0 bn4b19_branch2c[0][0] res4b18_relu[0][0]
res4b19_relu (Activation) (None, None, None, 1 0 res4b19[0][0]
res4b20_branch2a (Conv2D) (None, None, None, 2 262144 res4b19_relu[0][0]
bn4b20_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b20_branch2a[0][0]
res4b20_branch2a_relu (Activati (None, None, None, 2 0 bn4b20_branch2a[0][0]
padding4b20_branch2b (ZeroPaddi (None, None, None, 2 0 res4b20_branch2a_relu[0][0]
res4b20_branch2b (Conv2D) (None, None, None, 2 589824 padding4b20_branch2b[0][0]
bn4b20_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b20_branch2b[0][0]
res4b20_branch2b_relu (Activati (None, None, None, 2 0 bn4b20_branch2b[0][0]
res4b20_branch2c (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu[0][0]
bn4b20_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b20_branch2c[0][0]
res4b20 (Add) (None, None, None, 1 0 bn4b20_branch2c[0][0] res4b19_relu[0][0]
res4b20_relu (Activation) (None, None, None, 1 0 res4b20[0][0]
res4b21_branch2a (Conv2D) (None, None, None, 2 262144 res4b20_relu[0][0]
bn4b21_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b21_branch2a[0][0]
res4b21_branch2a_relu (Activati (None, None, None, 2 0 bn4b21_branch2a[0][0]
padding4b21_branch2b (ZeroPaddi (None, None, None, 2 0 res4b21_branch2a_relu[0][0]
res4b21_branch2b (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b[0][0]
bn4b21_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b21_branch2b[0][0]
res4b21_branch2b_relu (Activati (None, None, None, 2 0 bn4b21_branch2b[0][0]
res4b21_branch2c (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu[0][0]
bn4b21_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b21_branch2c[0][0]
res4b21 (Add) (None, None, None, 1 0 bn4b21_branch2c[0][0] res4b20_relu[0][0]
res4b21_relu (Activation) (None, None, None, 1 0 res4b21[0][0]
res4b22_branch2a (Conv2D) (None, None, None, 2 262144 res4b21_relu[0][0]
bn4b22_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b22_branch2a[0][0]
res4b22_branch2a_relu (Activati (None, None, None, 2 0 bn4b22_branch2a[0][0]
padding4b22_branch2b (ZeroPaddi (None, None, None, 2 0 res4b22_branch2a_relu[0][0]
res4b22_branch2b (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b[0][0]
bn4b22_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b22_branch2b[0][0]
res4b22_branch2b_relu (Activati (None, None, None, 2 0 bn4b22_branch2b[0][0]
res4b22_branch2c (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu[0][0]
bn4b22_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b22_branch2c[0][0]
res4b22 (Add) (None, None, None, 1 0 bn4b22_branch2c[0][0] res4b21_relu[0][0]
res4b22_relu (Activation) (None, None, None, 1 0 res4b22[0][0]
res5a_branch2a (Conv2D) (None, None, None, 5 524288 res4b22_relu[0][0]
bn5a_branch2a (BatchNormalizati (None, None, None, 5 2048 res5a_branch2a[0][0]
res5a_branch2a_relu (Activation (None, None, None, 5 0 bn5a_branch2a[0][0]
padding5a_branch2b (ZeroPadding (None, None, None, 5 0 res5a_branch2a_relu[0][0]
res5a_branch2b (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b[0][0]
bn5a_branch2b (BatchNormalizati (None, None, None, 5 2048 res5a_branch2b[0][0]
res5a_branch2b_relu (Activation (None, None, None, 5 0 bn5a_branch2b[0][0]
res5a_branch2c (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu[0][0]
res5a_branch1 (Conv2D) (None, None, None, 2 2097152 res4b22_relu[0][0]
bn5a_branch2c (BatchNormalizati (None, None, None, 2 8192 res5a_branch2c[0][0]
bn5a_branch1 (BatchNormalizatio (None, None, None, 2 8192 res5a_branch1[0][0]
res5a (Add) (None, None, None, 2 0 bn5a_branch2c[0][0] bn5a_branch1[0][0]
res5a_relu (Activation) (None, None, None, 2 0 res5a[0][0]
res5b_branch2a (Conv2D) (None, None, None, 5 1048576 res5a_relu[0][0]
bn5b_branch2a (BatchNormalizati (None, None, None, 5 2048 res5b_branch2a[0][0]
res5b_branch2a_relu (Activation (None, None, None, 5 0 bn5b_branch2a[0][0]
padding5b_branch2b (ZeroPadding (None, None, None, 5 0 res5b_branch2a_relu[0][0]
res5b_branch2b (Conv2D) (None, None, None, 5 2359296 padding5b_branch2b[0][0]
bn5b_branch2b (BatchNormalizati (None, None, None, 5 2048 res5b_branch2b[0][0]
res5b_branch2b_relu (Activation (None, None, None, 5 0 bn5b_branch2b[0][0]
res5b_branch2c (Conv2D) (None, None, None, 2 1048576 res5b_branch2b_relu[0][0]
bn5b_branch2c (BatchNormalizati (None, None, None, 2 8192 res5b_branch2c[0][0]
res5b (Add) (None, None, None, 2 0 bn5b_branch2c[0][0] res5a_relu[0][0]
res5b_relu (Activation) (None, None, None, 2 0 res5b[0][0]
res5c_branch2a (Conv2D) (None, None, None, 5 1048576 res5b_relu[0][0]
bn5c_branch2a (BatchNormalizati (None, None, None, 5 2048 res5c_branch2a[0][0]
res5c_branch2a_relu (Activation (None, None, None, 5 0 bn5c_branch2a[0][0]
padding5c_branch2b (ZeroPadding (None, None, None, 5 0 res5c_branch2a_relu[0][0]
res5c_branch2b (Conv2D) (None, None, None, 5 2359296 padding5c_branch2b[0][0]
bn5c_branch2b (BatchNormalizati (None, None, None, 5 2048 res5c_branch2b[0][0]
res5c_branch2b_relu (Activation (None, None, None, 5 0 bn5c_branch2b[0][0]
__________________________________________________________________________________________________ res5c_branch2c (Conv2D) (None, None, None, 2 1048576 res5c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn5c_branch2c (BatchNormalizati (None, None, None, 2 8192 res5c_branch2c[0][0] __________________________________________________________________________________________________ res5c (Add) (None, None, None, 2 0 bn5c_branch2c[0][0] res5b_relu[0][0] __________________________________________________________________________________________________ res5c_relu (Activation) (None, None, None, 2 0 res5c[0][0] __________________________________________________________________________________________________ C5_reduced (Conv2D) (None, None, None, 2 524544 res5c_relu[0][0] __________________________________________________________________________________________________ P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] res4b22_relu[0][0] __________________________________________________________________________________________________ C4_reduced (Conv2D) (None, None, None, 2 262400 res4b22_relu[0][0] __________________________________________________________________________________________________ P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0] __________________________________________________________________________________________________ P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] res3b3_relu[0][0] __________________________________________________________________________________________________ C3_reduced (Conv2D) (None, None, None, 2 131328 res3b3_relu[0][0] __________________________________________________________________________________________________ P6 (Conv2D) (None, None, None, 2 4718848 res5c_relu[0][0] __________________________________________________________________________________________________ P3_merged (Add) 
(None, None, None, 2 0 P4_upsampled[0][0] C3_reduced[0][0] __________________________________________________________________________________________________ C6_relu (Activation) (None, None, None, 2 0 P6[0][0] __________________________________________________________________________________________________ P3 (Conv2D) (None, None, None, 2 590080 P3_merged[0][0] __________________________________________________________________________________________________ P4 (Conv2D) (None, None, None, 2 590080 P4_merged[0][0] __________________________________________________________________________________________________ P5 (Conv2D) (None, None, None, 2 590080 C5_reduced[0][0] __________________________________________________________________________________________________ P7 (Conv2D) (None, None, None, 2 590080 C6_relu[0][0] __________________________________________________________________________________________________ regression_submodel (Model) (None, None, 4) 2443300 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ classification_submodel (Model) (None, None, 1) 2381065 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ regression (Concatenate) (None, None, 4) 0 regression_submodel[1][0] regression_submodel[2][0] regression_submodel[3][0] regression_submodel[4][0] regression_submodel[5][0] __________________________________________________________________________________________________ classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0] ================================================================================================== Total params: 55,427,309 Trainable params: 55,216,621 Non-trainable params: 210,688 
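The C5_reduced / P5_upsampled / P4_merged rows in the summary above are the FPN top-down pathway: each coarser level is reduced to 256 channels with a 1x1 convolution, upsampled to the next finer level's spatial size, and added to that level's lateral connection. A minimal NumPy sketch of the merge step (illustrative only: `upsample_like` here is a hypothetical stand-in for the model's UpsampleLike layer, and channels-last NHWC tensors are assumed):

```python
import numpy as np

def upsample_like(source, target):
    """Nearest-neighbour upsample `source` to the spatial size of `target`
    (NHWC layout), mimicking the UpsampleLike layer in the summary above."""
    th, tw = target.shape[1], target.shape[2]
    sh, sw = source.shape[1], source.shape[2]
    rows = np.arange(th) * sh // th  # map each target row to a source row
    cols = np.arange(tw) * sw // tw  # map each target col to a source col
    return source[:, rows][:, :, cols]

# Toy feature maps standing in for C5_reduced and C4_reduced (both 256 channels).
c5_reduced = np.random.rand(1, 7, 7, 256)
c4_reduced = np.random.rand(1, 14, 14, 256)

# P4_merged = P5_upsampled + C4_reduced, as in the Add layer above.
p5_upsampled = upsample_like(c5_reduced, c4_reduced)
p4_merged = p5_upsampled + c4_reduced
assert p4_merged.shape == (1, 14, 14, 256)
```

Because the summary's P3/P4/P5 heads each apply a 3x3 conv on 256-channel inputs, their param counts are identical (590080 = 3*3*256*256 + 256), which is consistent with this shared-width merge.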
__________________________________________________________________________________________________
None
Epoch 1/150
  1/500 [..............................] - ETA: 59:10 - loss: 4.2848 - regression_loss: 3.1543 - classification_loss: 1.1305
 50/500 [==>...........................] - ETA: 3:31 - loss: 3.9942 - regression_loss: 2.8712 - classification_loss: 1.1230
100/500 [=====>........................] - ETA: 2:41 - loss: 3.7830 - regression_loss: 2.7314 - classification_loss: 1.0516
150/500 [========>.....................] - ETA: 2:13 - loss: 3.5780 - regression_loss: 2.6538 - classification_loss: 0.9242
200/500 [===========>..................] - ETA: 1:50 - loss: 3.4352 - regression_loss: 2.6040 - classification_loss: 0.8312
250/500 [==============>...............] - ETA: 1:30 - loss: 3.3402 - regression_loss: 2.5760 - classification_loss: 0.7642
271/500 [===============>..............] - ETA: 1:22 - loss: 3.2947 - regression_loss: 2.5426 - classification_loss: 0.7521
272/500 [===============>..............]
- ETA: 1:22 - loss: 3.2907 - regression_loss: 2.5403 - classification_loss: 0.7504 273/500 [===============>..............] - ETA: 1:22 - loss: 3.2885 - regression_loss: 2.5395 - classification_loss: 0.7490 274/500 [===============>..............] - ETA: 1:21 - loss: 3.2864 - regression_loss: 2.5386 - classification_loss: 0.7478 275/500 [===============>..............] - ETA: 1:21 - loss: 3.2823 - regression_loss: 2.5359 - classification_loss: 0.7464 276/500 [===============>..............] - ETA: 1:20 - loss: 3.2821 - regression_loss: 2.5362 - classification_loss: 0.7459 277/500 [===============>..............] - ETA: 1:20 - loss: 3.2804 - regression_loss: 2.5357 - classification_loss: 0.7447 278/500 [===============>..............] - ETA: 1:20 - loss: 3.2785 - regression_loss: 2.5349 - classification_loss: 0.7436 279/500 [===============>..............] - ETA: 1:19 - loss: 3.2808 - regression_loss: 2.5377 - classification_loss: 0.7431 280/500 [===============>..............] - ETA: 1:19 - loss: 3.2793 - regression_loss: 2.5373 - classification_loss: 0.7421 281/500 [===============>..............] - ETA: 1:19 - loss: 3.2790 - regression_loss: 2.5374 - classification_loss: 0.7416 282/500 [===============>..............] - ETA: 1:18 - loss: 3.2777 - regression_loss: 2.5366 - classification_loss: 0.7412 283/500 [===============>..............] - ETA: 1:18 - loss: 3.2756 - regression_loss: 2.5351 - classification_loss: 0.7405 284/500 [================>.............] - ETA: 1:17 - loss: 3.2740 - regression_loss: 2.5347 - classification_loss: 0.7394 285/500 [================>.............] - ETA: 1:17 - loss: 3.2726 - regression_loss: 2.5341 - classification_loss: 0.7385 286/500 [================>.............] - ETA: 1:17 - loss: 3.2701 - regression_loss: 2.5329 - classification_loss: 0.7372 287/500 [================>.............] - ETA: 1:16 - loss: 3.2681 - regression_loss: 2.5318 - classification_loss: 0.7363 288/500 [================>.............] 
- ETA: 1:16 - loss: 3.2634 - regression_loss: 2.5288 - classification_loss: 0.7345 289/500 [================>.............] - ETA: 1:16 - loss: 3.2610 - regression_loss: 2.5276 - classification_loss: 0.7334 290/500 [================>.............] - ETA: 1:15 - loss: 3.2598 - regression_loss: 2.5272 - classification_loss: 0.7326 291/500 [================>.............] - ETA: 1:15 - loss: 3.2586 - regression_loss: 2.5270 - classification_loss: 0.7316 292/500 [================>.............] - ETA: 1:14 - loss: 3.2560 - regression_loss: 2.5256 - classification_loss: 0.7304 293/500 [================>.............] - ETA: 1:14 - loss: 3.2538 - regression_loss: 2.5246 - classification_loss: 0.7292 294/500 [================>.............] - ETA: 1:14 - loss: 3.2518 - regression_loss: 2.5235 - classification_loss: 0.7283 295/500 [================>.............] - ETA: 1:13 - loss: 3.2503 - regression_loss: 2.5231 - classification_loss: 0.7272 296/500 [================>.............] - ETA: 1:13 - loss: 3.2487 - regression_loss: 2.5221 - classification_loss: 0.7266 297/500 [================>.............] - ETA: 1:13 - loss: 3.2523 - regression_loss: 2.5242 - classification_loss: 0.7281 298/500 [================>.............] - ETA: 1:12 - loss: 3.2503 - regression_loss: 2.5232 - classification_loss: 0.7270 299/500 [================>.............] - ETA: 1:12 - loss: 3.2482 - regression_loss: 2.5224 - classification_loss: 0.7258 300/500 [=================>............] - ETA: 1:11 - loss: 3.2455 - regression_loss: 2.5209 - classification_loss: 0.7245 301/500 [=================>............] - ETA: 1:11 - loss: 3.2422 - regression_loss: 2.5193 - classification_loss: 0.7229 302/500 [=================>............] - ETA: 1:11 - loss: 3.2411 - regression_loss: 2.5191 - classification_loss: 0.7221 303/500 [=================>............] - ETA: 1:10 - loss: 3.2389 - regression_loss: 2.5183 - classification_loss: 0.7206 304/500 [=================>............] 
- ETA: 1:10 - loss: 3.2367 - regression_loss: 2.5173 - classification_loss: 0.7194 305/500 [=================>............] - ETA: 1:10 - loss: 3.2325 - regression_loss: 2.5144 - classification_loss: 0.7181 306/500 [=================>............] - ETA: 1:09 - loss: 3.2315 - regression_loss: 2.5142 - classification_loss: 0.7172 307/500 [=================>............] - ETA: 1:09 - loss: 3.2302 - regression_loss: 2.5136 - classification_loss: 0.7166 308/500 [=================>............] - ETA: 1:08 - loss: 3.2311 - regression_loss: 2.5145 - classification_loss: 0.7167 309/500 [=================>............] - ETA: 1:08 - loss: 3.2306 - regression_loss: 2.5150 - classification_loss: 0.7156 310/500 [=================>............] - ETA: 1:08 - loss: 3.2310 - regression_loss: 2.5154 - classification_loss: 0.7156 311/500 [=================>............] - ETA: 1:07 - loss: 3.2296 - regression_loss: 2.5147 - classification_loss: 0.7149 312/500 [=================>............] - ETA: 1:07 - loss: 3.2284 - regression_loss: 2.5144 - classification_loss: 0.7140 313/500 [=================>............] - ETA: 1:07 - loss: 3.2265 - regression_loss: 2.5134 - classification_loss: 0.7131 314/500 [=================>............] - ETA: 1:06 - loss: 3.2264 - regression_loss: 2.5137 - classification_loss: 0.7128 315/500 [=================>............] - ETA: 1:06 - loss: 3.2250 - regression_loss: 2.5131 - classification_loss: 0.7120 316/500 [=================>............] - ETA: 1:05 - loss: 3.2243 - regression_loss: 2.5127 - classification_loss: 0.7116 317/500 [==================>...........] - ETA: 1:05 - loss: 3.2253 - regression_loss: 2.5134 - classification_loss: 0.7118 318/500 [==================>...........] - ETA: 1:05 - loss: 3.2240 - regression_loss: 2.5130 - classification_loss: 0.7109 319/500 [==================>...........] - ETA: 1:04 - loss: 3.2225 - regression_loss: 2.5124 - classification_loss: 0.7101 320/500 [==================>...........] 
- ETA: 1:04 - loss: 3.2205 - regression_loss: 2.5114 - classification_loss: 0.7091 321/500 [==================>...........] - ETA: 1:04 - loss: 3.2185 - regression_loss: 2.5104 - classification_loss: 0.7081 322/500 [==================>...........] - ETA: 1:03 - loss: 3.2171 - regression_loss: 2.5096 - classification_loss: 0.7075 323/500 [==================>...........] - ETA: 1:03 - loss: 3.2154 - regression_loss: 2.5091 - classification_loss: 0.7063 324/500 [==================>...........] - ETA: 1:03 - loss: 3.2141 - regression_loss: 2.5089 - classification_loss: 0.7052 325/500 [==================>...........] - ETA: 1:02 - loss: 3.2165 - regression_loss: 2.5098 - classification_loss: 0.7067 326/500 [==================>...........] - ETA: 1:02 - loss: 3.2148 - regression_loss: 2.5093 - classification_loss: 0.7055 327/500 [==================>...........] - ETA: 1:01 - loss: 3.2137 - regression_loss: 2.5088 - classification_loss: 0.7049 328/500 [==================>...........] - ETA: 1:01 - loss: 3.2129 - regression_loss: 2.5087 - classification_loss: 0.7041 329/500 [==================>...........] - ETA: 1:01 - loss: 3.2110 - regression_loss: 2.5078 - classification_loss: 0.7032 330/500 [==================>...........] - ETA: 1:00 - loss: 3.2116 - regression_loss: 2.5085 - classification_loss: 0.7031 331/500 [==================>...........] - ETA: 1:00 - loss: 3.2094 - regression_loss: 2.5074 - classification_loss: 0.7020 332/500 [==================>...........] - ETA: 1:00 - loss: 3.2070 - regression_loss: 2.5060 - classification_loss: 0.7010 333/500 [==================>...........] - ETA: 59s - loss: 3.2053 - regression_loss: 2.5050 - classification_loss: 0.7003 334/500 [===================>..........] - ETA: 59s - loss: 3.2038 - regression_loss: 2.5045 - classification_loss: 0.6993 335/500 [===================>..........] - ETA: 58s - loss: 3.2019 - regression_loss: 2.5036 - classification_loss: 0.6983 336/500 [===================>..........] 
- ETA: 58s - loss: 3.1996 - regression_loss: 2.5024 - classification_loss: 0.6972 337/500 [===================>..........] - ETA: 58s - loss: 3.1992 - regression_loss: 2.5020 - classification_loss: 0.6971 338/500 [===================>..........] - ETA: 57s - loss: 3.1966 - regression_loss: 2.5007 - classification_loss: 0.6959 339/500 [===================>..........] - ETA: 57s - loss: 3.1955 - regression_loss: 2.5005 - classification_loss: 0.6950 340/500 [===================>..........] - ETA: 57s - loss: 3.1936 - regression_loss: 2.4995 - classification_loss: 0.6941 341/500 [===================>..........] - ETA: 56s - loss: 3.1915 - regression_loss: 2.4983 - classification_loss: 0.6931 342/500 [===================>..........] - ETA: 56s - loss: 3.1897 - regression_loss: 2.4975 - classification_loss: 0.6922 343/500 [===================>..........] - ETA: 56s - loss: 3.1882 - regression_loss: 2.4972 - classification_loss: 0.6910 344/500 [===================>..........] - ETA: 55s - loss: 3.1857 - regression_loss: 2.4959 - classification_loss: 0.6898 345/500 [===================>..........] - ETA: 55s - loss: 3.1844 - regression_loss: 2.4957 - classification_loss: 0.6887 346/500 [===================>..........] - ETA: 54s - loss: 3.1825 - regression_loss: 2.4947 - classification_loss: 0.6878 347/500 [===================>..........] - ETA: 54s - loss: 3.1818 - regression_loss: 2.4946 - classification_loss: 0.6872 348/500 [===================>..........] - ETA: 54s - loss: 3.1808 - regression_loss: 2.4946 - classification_loss: 0.6862 349/500 [===================>..........] - ETA: 53s - loss: 3.1808 - regression_loss: 2.4953 - classification_loss: 0.6855 350/500 [====================>.........] - ETA: 53s - loss: 3.1779 - regression_loss: 2.4938 - classification_loss: 0.6842 351/500 [====================>.........] - ETA: 53s - loss: 3.1764 - regression_loss: 2.4933 - classification_loss: 0.6831 352/500 [====================>.........] 
- ETA: 52s - loss: 3.1784 - regression_loss: 2.4948 - classification_loss: 0.6837 353/500 [====================>.........] - ETA: 52s - loss: 3.1760 - regression_loss: 2.4934 - classification_loss: 0.6826 354/500 [====================>.........] - ETA: 52s - loss: 3.1767 - regression_loss: 2.4939 - classification_loss: 0.6829 355/500 [====================>.........] - ETA: 51s - loss: 3.1760 - regression_loss: 2.4937 - classification_loss: 0.6823 356/500 [====================>.........] - ETA: 51s - loss: 3.1757 - regression_loss: 2.4939 - classification_loss: 0.6817 357/500 [====================>.........] - ETA: 50s - loss: 3.1774 - regression_loss: 2.4947 - classification_loss: 0.6827 358/500 [====================>.........] - ETA: 50s - loss: 3.1756 - regression_loss: 2.4939 - classification_loss: 0.6817 359/500 [====================>.........] - ETA: 50s - loss: 3.1749 - regression_loss: 2.4939 - classification_loss: 0.6811 360/500 [====================>.........] - ETA: 49s - loss: 3.1741 - regression_loss: 2.4936 - classification_loss: 0.6805 361/500 [====================>.........] - ETA: 49s - loss: 3.1713 - regression_loss: 2.4920 - classification_loss: 0.6794 362/500 [====================>.........] - ETA: 49s - loss: 3.1703 - regression_loss: 2.4916 - classification_loss: 0.6787 363/500 [====================>.........] - ETA: 48s - loss: 3.1700 - regression_loss: 2.4918 - classification_loss: 0.6782 364/500 [====================>.........] - ETA: 48s - loss: 3.1676 - regression_loss: 2.4906 - classification_loss: 0.6770 365/500 [====================>.........] - ETA: 48s - loss: 3.1655 - regression_loss: 2.4895 - classification_loss: 0.6760 366/500 [====================>.........] - ETA: 47s - loss: 3.1647 - regression_loss: 2.4893 - classification_loss: 0.6755 367/500 [=====================>........] - ETA: 47s - loss: 3.1634 - regression_loss: 2.4887 - classification_loss: 0.6746 368/500 [=====================>........] 
- ETA: 46s - loss: 3.1614 - regression_loss: 2.4877 - classification_loss: 0.6737 369/500 [=====================>........] - ETA: 46s - loss: 3.1597 - regression_loss: 2.4870 - classification_loss: 0.6727 370/500 [=====================>........] - ETA: 46s - loss: 3.1607 - regression_loss: 2.4879 - classification_loss: 0.6728 371/500 [=====================>........] - ETA: 45s - loss: 3.1585 - regression_loss: 2.4866 - classification_loss: 0.6720 372/500 [=====================>........] - ETA: 45s - loss: 3.1565 - regression_loss: 2.4853 - classification_loss: 0.6712 373/500 [=====================>........] - ETA: 45s - loss: 3.1545 - regression_loss: 2.4842 - classification_loss: 0.6703 374/500 [=====================>........] - ETA: 44s - loss: 3.1558 - regression_loss: 2.4853 - classification_loss: 0.6705 375/500 [=====================>........] - ETA: 44s - loss: 3.1543 - regression_loss: 2.4844 - classification_loss: 0.6700 376/500 [=====================>........] - ETA: 44s - loss: 3.1524 - regression_loss: 2.4833 - classification_loss: 0.6691 377/500 [=====================>........] - ETA: 43s - loss: 3.1520 - regression_loss: 2.4834 - classification_loss: 0.6685 378/500 [=====================>........] - ETA: 43s - loss: 3.1511 - regression_loss: 2.4833 - classification_loss: 0.6678 379/500 [=====================>........] - ETA: 42s - loss: 3.1516 - regression_loss: 2.4846 - classification_loss: 0.6671 380/500 [=====================>........] - ETA: 42s - loss: 3.1506 - regression_loss: 2.4843 - classification_loss: 0.6663 381/500 [=====================>........] - ETA: 42s - loss: 3.1516 - regression_loss: 2.4847 - classification_loss: 0.6668 382/500 [=====================>........] - ETA: 41s - loss: 3.1505 - regression_loss: 2.4845 - classification_loss: 0.6661 383/500 [=====================>........] - ETA: 41s - loss: 3.1525 - regression_loss: 2.4846 - classification_loss: 0.6678 384/500 [======================>.......] 
- ETA: 41s - loss: 3.1504 - regression_loss: 2.4834 - classification_loss: 0.6670 385/500 [======================>.......] - ETA: 40s - loss: 3.1485 - regression_loss: 2.4824 - classification_loss: 0.6661 386/500 [======================>.......] - ETA: 40s - loss: 3.1473 - regression_loss: 2.4820 - classification_loss: 0.6654 387/500 [======================>.......] - ETA: 40s - loss: 3.1459 - regression_loss: 2.4811 - classification_loss: 0.6648 388/500 [======================>.......] - ETA: 39s - loss: 3.1425 - regression_loss: 2.4788 - classification_loss: 0.6637 389/500 [======================>.......] - ETA: 39s - loss: 3.1407 - regression_loss: 2.4777 - classification_loss: 0.6630 390/500 [======================>.......] - ETA: 39s - loss: 3.1386 - regression_loss: 2.4766 - classification_loss: 0.6621 391/500 [======================>.......] - ETA: 38s - loss: 3.1371 - regression_loss: 2.4759 - classification_loss: 0.6612 392/500 [======================>.......] - ETA: 38s - loss: 3.1353 - regression_loss: 2.4747 - classification_loss: 0.6606 393/500 [======================>.......] - ETA: 37s - loss: 3.1375 - regression_loss: 2.4751 - classification_loss: 0.6625 394/500 [======================>.......] - ETA: 37s - loss: 3.1359 - regression_loss: 2.4741 - classification_loss: 0.6618 395/500 [======================>.......] - ETA: 37s - loss: 3.1355 - regression_loss: 2.4741 - classification_loss: 0.6614 396/500 [======================>.......] - ETA: 36s - loss: 3.1336 - regression_loss: 2.4730 - classification_loss: 0.6606 397/500 [======================>.......] - ETA: 36s - loss: 3.1342 - regression_loss: 2.4737 - classification_loss: 0.6605 398/500 [======================>.......] - ETA: 36s - loss: 3.1320 - regression_loss: 2.4724 - classification_loss: 0.6596 399/500 [======================>.......] - ETA: 35s - loss: 3.1320 - regression_loss: 2.4725 - classification_loss: 0.6596 400/500 [=======================>......] 
- ETA: 35s - loss: 3.1319 - regression_loss: 2.4726 - classification_loss: 0.6593 401/500 [=======================>......] - ETA: 35s - loss: 3.1309 - regression_loss: 2.4722 - classification_loss: 0.6587 402/500 [=======================>......] - ETA: 34s - loss: 3.1294 - regression_loss: 2.4715 - classification_loss: 0.6579 403/500 [=======================>......] - ETA: 34s - loss: 3.1283 - regression_loss: 2.4708 - classification_loss: 0.6574 404/500 [=======================>......] - ETA: 33s - loss: 3.1270 - regression_loss: 2.4700 - classification_loss: 0.6569 405/500 [=======================>......] - ETA: 33s - loss: 3.1261 - regression_loss: 2.4696 - classification_loss: 0.6565 406/500 [=======================>......] - ETA: 33s - loss: 3.1251 - regression_loss: 2.4690 - classification_loss: 0.6561 407/500 [=======================>......] - ETA: 32s - loss: 3.1238 - regression_loss: 2.4683 - classification_loss: 0.6555 408/500 [=======================>......] - ETA: 32s - loss: 3.1224 - regression_loss: 2.4676 - classification_loss: 0.6548 409/500 [=======================>......] - ETA: 32s - loss: 3.1209 - regression_loss: 2.4669 - classification_loss: 0.6540 410/500 [=======================>......] - ETA: 31s - loss: 3.1196 - regression_loss: 2.4661 - classification_loss: 0.6535 411/500 [=======================>......] - ETA: 31s - loss: 3.1179 - regression_loss: 2.4649 - classification_loss: 0.6530 412/500 [=======================>......] - ETA: 31s - loss: 3.1181 - regression_loss: 2.4653 - classification_loss: 0.6527 413/500 [=======================>......] - ETA: 30s - loss: 3.1165 - regression_loss: 2.4642 - classification_loss: 0.6523 414/500 [=======================>......] - ETA: 30s - loss: 3.1153 - regression_loss: 2.4636 - classification_loss: 0.6518 415/500 [=======================>......] - ETA: 30s - loss: 3.1146 - regression_loss: 2.4633 - classification_loss: 0.6513 416/500 [=======================>......] 
- ETA: 29s - loss: 3.1143 - regression_loss: 2.4635 - classification_loss: 0.6509 417/500 [========================>.....] - ETA: 29s - loss: 3.1135 - regression_loss: 2.4631 - classification_loss: 0.6504 418/500 [========================>.....] - ETA: 29s - loss: 3.1125 - regression_loss: 2.4626 - classification_loss: 0.6500 419/500 [========================>.....] - ETA: 28s - loss: 3.1132 - regression_loss: 2.4630 - classification_loss: 0.6502 420/500 [========================>.....] - ETA: 28s - loss: 3.1106 - regression_loss: 2.4612 - classification_loss: 0.6493 421/500 [========================>.....] - ETA: 27s - loss: 3.1098 - regression_loss: 2.4610 - classification_loss: 0.6488 422/500 [========================>.....] - ETA: 27s - loss: 3.1090 - regression_loss: 2.4608 - classification_loss: 0.6482 423/500 [========================>.....] - ETA: 27s - loss: 3.1081 - regression_loss: 2.4603 - classification_loss: 0.6477 424/500 [========================>.....] - ETA: 26s - loss: 3.1059 - regression_loss: 2.4588 - classification_loss: 0.6471 425/500 [========================>.....] - ETA: 26s - loss: 3.1046 - regression_loss: 2.4579 - classification_loss: 0.6467 426/500 [========================>.....] - ETA: 26s - loss: 3.1031 - regression_loss: 2.4571 - classification_loss: 0.6460 427/500 [========================>.....] - ETA: 25s - loss: 3.1032 - regression_loss: 2.4573 - classification_loss: 0.6459 428/500 [========================>.....] - ETA: 25s - loss: 3.1013 - regression_loss: 2.4560 - classification_loss: 0.6453 429/500 [========================>.....] - ETA: 25s - loss: 3.0989 - regression_loss: 2.4544 - classification_loss: 0.6445 430/500 [========================>.....] - ETA: 24s - loss: 3.0977 - regression_loss: 2.4538 - classification_loss: 0.6439 431/500 [========================>.....] - ETA: 24s - loss: 3.0973 - regression_loss: 2.4538 - classification_loss: 0.6435 432/500 [========================>.....] 
- ETA: 24s - loss: 3.0958 - regression_loss: 2.4531 - classification_loss: 0.6427 433/500 [========================>.....] - ETA: 23s - loss: 3.0941 - regression_loss: 2.4519 - classification_loss: 0.6422 434/500 [=========================>....] - ETA: 23s - loss: 3.0918 - regression_loss: 2.4505 - classification_loss: 0.6413 435/500 [=========================>....] - ETA: 22s - loss: 3.0917 - regression_loss: 2.4505 - classification_loss: 0.6412 436/500 [=========================>....] - ETA: 22s - loss: 3.0955 - regression_loss: 2.4532 - classification_loss: 0.6423 437/500 [=========================>....] - ETA: 22s - loss: 3.0945 - regression_loss: 2.4526 - classification_loss: 0.6419 438/500 [=========================>....] - ETA: 21s - loss: 3.0923 - regression_loss: 2.4513 - classification_loss: 0.6410 439/500 [=========================>....] - ETA: 21s - loss: 3.0901 - regression_loss: 2.4499 - classification_loss: 0.6403 440/500 [=========================>....] - ETA: 21s - loss: 3.0919 - regression_loss: 2.4516 - classification_loss: 0.6403 441/500 [=========================>....] - ETA: 20s - loss: 3.0896 - regression_loss: 2.4500 - classification_loss: 0.6396 442/500 [=========================>....] - ETA: 20s - loss: 3.0900 - regression_loss: 2.4508 - classification_loss: 0.6393 443/500 [=========================>....] - ETA: 20s - loss: 3.0889 - regression_loss: 2.4501 - classification_loss: 0.6388 444/500 [=========================>....] - ETA: 19s - loss: 3.0888 - regression_loss: 2.4498 - classification_loss: 0.6390 445/500 [=========================>....] - ETA: 19s - loss: 3.0865 - regression_loss: 2.4484 - classification_loss: 0.6381 446/500 [=========================>....] - ETA: 19s - loss: 3.0859 - regression_loss: 2.4480 - classification_loss: 0.6378 447/500 [=========================>....] - ETA: 18s - loss: 3.0843 - regression_loss: 2.4471 - classification_loss: 0.6372 448/500 [=========================>....] 
- ETA: 18s - loss: 3.0830 - regression_loss: 2.4465 - classification_loss: 0.6365 449/500 [=========================>....] - ETA: 17s - loss: 3.0807 - regression_loss: 2.4449 - classification_loss: 0.6358 450/500 [==========================>...] - ETA: 17s - loss: 3.0826 - regression_loss: 2.4468 - classification_loss: 0.6358 451/500 [==========================>...] - ETA: 17s - loss: 3.0804 - regression_loss: 2.4452 - classification_loss: 0.6351 452/500 [==========================>...] - ETA: 16s - loss: 3.0792 - regression_loss: 2.4447 - classification_loss: 0.6345 453/500 [==========================>...] - ETA: 16s - loss: 3.0786 - regression_loss: 2.4447 - classification_loss: 0.6339 454/500 [==========================>...] - ETA: 16s - loss: 3.0798 - regression_loss: 2.4459 - classification_loss: 0.6339 455/500 [==========================>...] - ETA: 15s - loss: 3.0793 - regression_loss: 2.4456 - classification_loss: 0.6338 456/500 [==========================>...] - ETA: 15s - loss: 3.0777 - regression_loss: 2.4445 - classification_loss: 0.6332 457/500 [==========================>...] - ETA: 15s - loss: 3.0760 - regression_loss: 2.4433 - classification_loss: 0.6327 458/500 [==========================>...] - ETA: 14s - loss: 3.0746 - regression_loss: 2.4423 - classification_loss: 0.6323 459/500 [==========================>...] - ETA: 14s - loss: 3.0710 - regression_loss: 2.4398 - classification_loss: 0.6313 460/500 [==========================>...] - ETA: 14s - loss: 3.0699 - regression_loss: 2.4391 - classification_loss: 0.6308 461/500 [==========================>...] - ETA: 13s - loss: 3.0696 - regression_loss: 2.4392 - classification_loss: 0.6304 462/500 [==========================>...] - ETA: 13s - loss: 3.0685 - regression_loss: 2.4387 - classification_loss: 0.6298 463/500 [==========================>...] - ETA: 13s - loss: 3.0677 - regression_loss: 2.4385 - classification_loss: 0.6292 464/500 [==========================>...] 
- ETA: 12s - loss: 3.0656 - regression_loss: 2.4370 - classification_loss: 0.6285 465/500 [==========================>...] - ETA: 12s - loss: 3.0653 - regression_loss: 2.4374 - classification_loss: 0.6280 466/500 [==========================>...] - ETA: 11s - loss: 3.0670 - regression_loss: 2.4377 - classification_loss: 0.6293 467/500 [===========================>..] - ETA: 11s - loss: 3.0660 - regression_loss: 2.4371 - classification_loss: 0.6288 468/500 [===========================>..] - ETA: 11s - loss: 3.0652 - regression_loss: 2.4371 - classification_loss: 0.6281 469/500 [===========================>..] - ETA: 10s - loss: 3.0646 - regression_loss: 2.4368 - classification_loss: 0.6278 470/500 [===========================>..] - ETA: 10s - loss: 3.0631 - regression_loss: 2.4358 - classification_loss: 0.6273 471/500 [===========================>..] - ETA: 10s - loss: 3.0605 - regression_loss: 2.4340 - classification_loss: 0.6265 472/500 [===========================>..] - ETA: 9s - loss: 3.0590 - regression_loss: 2.4332 - classification_loss: 0.6258 473/500 [===========================>..] - ETA: 9s - loss: 3.0576 - regression_loss: 2.4324 - classification_loss: 0.6252 474/500 [===========================>..] - ETA: 9s - loss: 3.0575 - regression_loss: 2.4325 - classification_loss: 0.6251 475/500 [===========================>..] - ETA: 8s - loss: 3.0565 - regression_loss: 2.4320 - classification_loss: 0.6245 476/500 [===========================>..] - ETA: 8s - loss: 3.0555 - regression_loss: 2.4315 - classification_loss: 0.6240 477/500 [===========================>..] - ETA: 8s - loss: 3.0559 - regression_loss: 2.4315 - classification_loss: 0.6244 478/500 [===========================>..] - ETA: 7s - loss: 3.0553 - regression_loss: 2.4313 - classification_loss: 0.6239 479/500 [===========================>..] - ETA: 7s - loss: 3.0541 - regression_loss: 2.4308 - classification_loss: 0.6233 480/500 [===========================>..] 
- ETA: 7s - loss: 3.0523 - regression_loss: 2.4297 - classification_loss: 0.6226 481/500 [===========================>..] - ETA: 6s - loss: 3.0499 - regression_loss: 2.4281 - classification_loss: 0.6219 482/500 [===========================>..] - ETA: 6s - loss: 3.0490 - regression_loss: 2.4277 - classification_loss: 0.6214 483/500 [===========================>..] - ETA: 5s - loss: 3.0479 - regression_loss: 2.4269 - classification_loss: 0.6209 484/500 [============================>.] - ETA: 5s - loss: 3.0461 - regression_loss: 2.4258 - classification_loss: 0.6203 485/500 [============================>.] - ETA: 5s - loss: 3.0444 - regression_loss: 2.4245 - classification_loss: 0.6199 486/500 [============================>.] - ETA: 4s - loss: 3.0415 - regression_loss: 2.4223 - classification_loss: 0.6191 487/500 [============================>.] - ETA: 4s - loss: 3.0410 - regression_loss: 2.4225 - classification_loss: 0.6186 488/500 [============================>.] - ETA: 4s - loss: 3.0402 - regression_loss: 2.4220 - classification_loss: 0.6182 489/500 [============================>.] - ETA: 3s - loss: 3.0370 - regression_loss: 2.4197 - classification_loss: 0.6173 490/500 [============================>.] - ETA: 3s - loss: 3.0347 - regression_loss: 2.4182 - classification_loss: 0.6165 491/500 [============================>.] - ETA: 3s - loss: 3.0343 - regression_loss: 2.4180 - classification_loss: 0.6163 492/500 [============================>.] - ETA: 2s - loss: 3.0328 - regression_loss: 2.4172 - classification_loss: 0.6157 493/500 [============================>.] - ETA: 2s - loss: 3.0320 - regression_loss: 2.4165 - classification_loss: 0.6155 494/500 [============================>.] - ETA: 2s - loss: 3.0296 - regression_loss: 2.4148 - classification_loss: 0.6148 495/500 [============================>.] - ETA: 1s - loss: 3.0282 - regression_loss: 2.4138 - classification_loss: 0.6144 496/500 [============================>.] 
- ETA: 1s - loss: 3.0312 - regression_loss: 2.4165 - classification_loss: 0.6147
497/500 [============================>.] - ETA: 1s - loss: 3.0296 - regression_loss: 2.4157 - classification_loss: 0.6139
498/500 [============================>.] - ETA: 0s - loss: 3.0279 - regression_loss: 2.4147 - classification_loss: 0.6132
499/500 [============================>.] - ETA: 0s - loss: 3.0265 - regression_loss: 2.4137 - classification_loss: 0.6128
500/500 [==============================] - 176s 351ms/step - loss: 3.0258 - regression_loss: 2.4132 - classification_loss: 0.6126
326 instances of class plum with average precision: 0.4767
mAP: 0.4767
Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5
Epoch 2/150
1/500 [..............................] - ETA: 2:50 - loss: 3.0422 - regression_loss: 2.5926 - classification_loss: 0.4496
[... per-batch progress updates for steps 2-10 elided ...]
11/500 [..............................]
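As a quick sanity check on the epoch-1 summary in the log (this snippet is my own illustration, not part of the original output): the total loss keras-retinanet reports is, as the numbers suggest, the sum of the box-regression term and the classification term, and since this run has a single class (plum), the reported mAP is just that one class's average precision.

```python
# Sanity check on the epoch-1 summary values from the log above.
# Assumption: reported loss = regression_loss + classification_loss,
# and mAP = mean of the per-class APs (one class here: plum).

regression_loss = 2.4132      # epoch-1 final value from the log
classification_loss = 0.6126  # epoch-1 final value from the log
total_loss = regression_loss + classification_loss
assert abs(total_loss - 3.0258) < 1e-4  # matches the logged loss

per_class_ap = {"plum": 0.4767}          # single class in this dataset
mAP = sum(per_class_ap.values()) / len(per_class_ap)
assert abs(mAP - 0.4767) < 1e-4          # matches the logged mAP
print(f"total loss = {total_loss:.4f}, mAP = {mAP:.4f}")
```

With more than one class, the logged mAP would diverge from any single class's AP, which is why the per-class lines are printed separately before the mAP line.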
[... per-step progress output omitted; loss decreased from about 2.53 at step 12 to about 2.39 by step 314 ...]
314/500 [=================>............] - ETA: 1:02 - loss: 2.3885 - regression_loss: 2.0021 - classification_loss: 0.3864
315/500 [=================>............]
- ETA: 1:02 - loss: 2.3878 - regression_loss: 2.0017 - classification_loss: 0.3862 316/500 [=================>............] - ETA: 1:02 - loss: 2.3875 - regression_loss: 2.0013 - classification_loss: 0.3862 317/500 [==================>...........] - ETA: 1:01 - loss: 2.3866 - regression_loss: 2.0007 - classification_loss: 0.3859 318/500 [==================>...........] - ETA: 1:01 - loss: 2.3849 - regression_loss: 1.9993 - classification_loss: 0.3856 319/500 [==================>...........] - ETA: 1:01 - loss: 2.3846 - regression_loss: 1.9990 - classification_loss: 0.3855 320/500 [==================>...........] - ETA: 1:00 - loss: 2.3846 - regression_loss: 1.9992 - classification_loss: 0.3854 321/500 [==================>...........] - ETA: 1:00 - loss: 2.3840 - regression_loss: 1.9987 - classification_loss: 0.3853 322/500 [==================>...........] - ETA: 1:00 - loss: 2.3855 - regression_loss: 1.9998 - classification_loss: 0.3856 323/500 [==================>...........] - ETA: 59s - loss: 2.3835 - regression_loss: 1.9983 - classification_loss: 0.3853  324/500 [==================>...........] - ETA: 59s - loss: 2.3820 - regression_loss: 1.9972 - classification_loss: 0.3848 325/500 [==================>...........] - ETA: 59s - loss: 2.3842 - regression_loss: 1.9988 - classification_loss: 0.3854 326/500 [==================>...........] - ETA: 58s - loss: 2.3834 - regression_loss: 1.9983 - classification_loss: 0.3850 327/500 [==================>...........] - ETA: 58s - loss: 2.3830 - regression_loss: 1.9982 - classification_loss: 0.3848 328/500 [==================>...........] - ETA: 58s - loss: 2.3837 - regression_loss: 1.9987 - classification_loss: 0.3850 329/500 [==================>...........] - ETA: 57s - loss: 2.3859 - regression_loss: 2.0006 - classification_loss: 0.3852 330/500 [==================>...........] - ETA: 57s - loss: 2.3867 - regression_loss: 2.0015 - classification_loss: 0.3853 331/500 [==================>...........] 
- ETA: 57s - loss: 2.3906 - regression_loss: 2.0041 - classification_loss: 0.3865 332/500 [==================>...........] - ETA: 56s - loss: 2.3927 - regression_loss: 2.0051 - classification_loss: 0.3876 333/500 [==================>...........] - ETA: 56s - loss: 2.3915 - regression_loss: 2.0042 - classification_loss: 0.3874 334/500 [===================>..........] - ETA: 56s - loss: 2.3913 - regression_loss: 2.0042 - classification_loss: 0.3870 335/500 [===================>..........] - ETA: 55s - loss: 2.3925 - regression_loss: 2.0051 - classification_loss: 0.3874 336/500 [===================>..........] - ETA: 55s - loss: 2.3927 - regression_loss: 2.0048 - classification_loss: 0.3879 337/500 [===================>..........] - ETA: 55s - loss: 2.3929 - regression_loss: 2.0048 - classification_loss: 0.3881 338/500 [===================>..........] - ETA: 54s - loss: 2.3922 - regression_loss: 2.0040 - classification_loss: 0.3882 339/500 [===================>..........] - ETA: 54s - loss: 2.3908 - regression_loss: 2.0029 - classification_loss: 0.3879 340/500 [===================>..........] - ETA: 54s - loss: 2.3879 - regression_loss: 2.0005 - classification_loss: 0.3874 341/500 [===================>..........] - ETA: 53s - loss: 2.3880 - regression_loss: 2.0008 - classification_loss: 0.3872 342/500 [===================>..........] - ETA: 53s - loss: 2.3872 - regression_loss: 2.0003 - classification_loss: 0.3869 343/500 [===================>..........] - ETA: 53s - loss: 2.3875 - regression_loss: 2.0001 - classification_loss: 0.3874 344/500 [===================>..........] - ETA: 52s - loss: 2.3863 - regression_loss: 1.9993 - classification_loss: 0.3869 345/500 [===================>..........] - ETA: 52s - loss: 2.3840 - regression_loss: 1.9974 - classification_loss: 0.3866 346/500 [===================>..........] - ETA: 52s - loss: 2.3829 - regression_loss: 1.9966 - classification_loss: 0.3862 347/500 [===================>..........] 
- ETA: 51s - loss: 2.3838 - regression_loss: 1.9974 - classification_loss: 0.3863 348/500 [===================>..........] - ETA: 51s - loss: 2.3830 - regression_loss: 1.9968 - classification_loss: 0.3862 349/500 [===================>..........] - ETA: 51s - loss: 2.3852 - regression_loss: 1.9984 - classification_loss: 0.3868 350/500 [====================>.........] - ETA: 50s - loss: 2.3854 - regression_loss: 1.9985 - classification_loss: 0.3869 351/500 [====================>.........] - ETA: 50s - loss: 2.3832 - regression_loss: 1.9967 - classification_loss: 0.3865 352/500 [====================>.........] - ETA: 50s - loss: 2.3820 - regression_loss: 1.9958 - classification_loss: 0.3863 353/500 [====================>.........] - ETA: 49s - loss: 2.3843 - regression_loss: 1.9977 - classification_loss: 0.3866 354/500 [====================>.........] - ETA: 49s - loss: 2.3850 - regression_loss: 1.9984 - classification_loss: 0.3866 355/500 [====================>.........] - ETA: 49s - loss: 2.3856 - regression_loss: 1.9985 - classification_loss: 0.3871 356/500 [====================>.........] - ETA: 48s - loss: 2.3883 - regression_loss: 2.0006 - classification_loss: 0.3877 357/500 [====================>.........] - ETA: 48s - loss: 2.3862 - regression_loss: 1.9989 - classification_loss: 0.3873 358/500 [====================>.........] - ETA: 48s - loss: 2.3856 - regression_loss: 1.9984 - classification_loss: 0.3873 359/500 [====================>.........] - ETA: 47s - loss: 2.3873 - regression_loss: 1.9996 - classification_loss: 0.3877 360/500 [====================>.........] - ETA: 47s - loss: 2.3867 - regression_loss: 1.9992 - classification_loss: 0.3875 361/500 [====================>.........] - ETA: 47s - loss: 2.3851 - regression_loss: 1.9979 - classification_loss: 0.3872 362/500 [====================>.........] - ETA: 46s - loss: 2.3866 - regression_loss: 1.9991 - classification_loss: 0.3875 363/500 [====================>.........] 
- ETA: 46s - loss: 2.3858 - regression_loss: 1.9986 - classification_loss: 0.3872 364/500 [====================>.........] - ETA: 46s - loss: 2.3875 - regression_loss: 1.9994 - classification_loss: 0.3881 365/500 [====================>.........] - ETA: 45s - loss: 2.3887 - regression_loss: 2.0003 - classification_loss: 0.3884 366/500 [====================>.........] - ETA: 45s - loss: 2.3900 - regression_loss: 2.0010 - classification_loss: 0.3890 367/500 [=====================>........] - ETA: 45s - loss: 2.3899 - regression_loss: 2.0011 - classification_loss: 0.3888 368/500 [=====================>........] - ETA: 44s - loss: 2.3930 - regression_loss: 2.0030 - classification_loss: 0.3900 369/500 [=====================>........] - ETA: 44s - loss: 2.3918 - regression_loss: 2.0020 - classification_loss: 0.3898 370/500 [=====================>........] - ETA: 44s - loss: 2.3913 - regression_loss: 2.0016 - classification_loss: 0.3897 371/500 [=====================>........] - ETA: 43s - loss: 2.3913 - regression_loss: 2.0016 - classification_loss: 0.3897 372/500 [=====================>........] - ETA: 43s - loss: 2.3914 - regression_loss: 2.0016 - classification_loss: 0.3897 373/500 [=====================>........] - ETA: 43s - loss: 2.3924 - regression_loss: 2.0024 - classification_loss: 0.3900 374/500 [=====================>........] - ETA: 42s - loss: 2.3913 - regression_loss: 2.0016 - classification_loss: 0.3898 375/500 [=====================>........] - ETA: 42s - loss: 2.3921 - regression_loss: 2.0023 - classification_loss: 0.3898 376/500 [=====================>........] - ETA: 42s - loss: 2.3909 - regression_loss: 2.0015 - classification_loss: 0.3894 377/500 [=====================>........] - ETA: 41s - loss: 2.3917 - regression_loss: 2.0022 - classification_loss: 0.3895 378/500 [=====================>........] - ETA: 41s - loss: 2.3927 - regression_loss: 2.0028 - classification_loss: 0.3899 379/500 [=====================>........] 
- ETA: 40s - loss: 2.3905 - regression_loss: 2.0008 - classification_loss: 0.3897 380/500 [=====================>........] - ETA: 40s - loss: 2.3907 - regression_loss: 2.0010 - classification_loss: 0.3897 381/500 [=====================>........] - ETA: 40s - loss: 2.3901 - regression_loss: 2.0007 - classification_loss: 0.3894 382/500 [=====================>........] - ETA: 39s - loss: 2.3914 - regression_loss: 2.0016 - classification_loss: 0.3898 383/500 [=====================>........] - ETA: 39s - loss: 2.3919 - regression_loss: 2.0021 - classification_loss: 0.3897 384/500 [======================>.......] - ETA: 39s - loss: 2.3900 - regression_loss: 2.0008 - classification_loss: 0.3892 385/500 [======================>.......] - ETA: 38s - loss: 2.3888 - regression_loss: 1.9999 - classification_loss: 0.3890 386/500 [======================>.......] - ETA: 38s - loss: 2.3865 - regression_loss: 1.9980 - classification_loss: 0.3885 387/500 [======================>.......] - ETA: 38s - loss: 2.3864 - regression_loss: 1.9980 - classification_loss: 0.3884 388/500 [======================>.......] - ETA: 37s - loss: 2.3856 - regression_loss: 1.9974 - classification_loss: 0.3882 389/500 [======================>.......] - ETA: 37s - loss: 2.3855 - regression_loss: 1.9974 - classification_loss: 0.3880 390/500 [======================>.......] - ETA: 37s - loss: 2.3867 - regression_loss: 1.9985 - classification_loss: 0.3882 391/500 [======================>.......] - ETA: 36s - loss: 2.3886 - regression_loss: 2.0000 - classification_loss: 0.3886 392/500 [======================>.......] - ETA: 36s - loss: 2.3887 - regression_loss: 2.0002 - classification_loss: 0.3884 393/500 [======================>.......] - ETA: 36s - loss: 2.3900 - regression_loss: 2.0011 - classification_loss: 0.3890 394/500 [======================>.......] - ETA: 35s - loss: 2.3911 - regression_loss: 2.0020 - classification_loss: 0.3891 395/500 [======================>.......] 
- ETA: 35s - loss: 2.3901 - regression_loss: 2.0012 - classification_loss: 0.3888 396/500 [======================>.......] - ETA: 35s - loss: 2.3899 - regression_loss: 2.0013 - classification_loss: 0.3886 397/500 [======================>.......] - ETA: 34s - loss: 2.3892 - regression_loss: 2.0008 - classification_loss: 0.3884 398/500 [======================>.......] - ETA: 34s - loss: 2.3866 - regression_loss: 1.9988 - classification_loss: 0.3877 399/500 [======================>.......] - ETA: 34s - loss: 2.3866 - regression_loss: 1.9988 - classification_loss: 0.3878 400/500 [=======================>......] - ETA: 33s - loss: 2.3855 - regression_loss: 1.9980 - classification_loss: 0.3876 401/500 [=======================>......] - ETA: 33s - loss: 2.3857 - regression_loss: 1.9982 - classification_loss: 0.3875 402/500 [=======================>......] - ETA: 33s - loss: 2.3861 - regression_loss: 1.9982 - classification_loss: 0.3879 403/500 [=======================>......] - ETA: 32s - loss: 2.3859 - regression_loss: 1.9981 - classification_loss: 0.3878 404/500 [=======================>......] - ETA: 32s - loss: 2.3847 - regression_loss: 1.9973 - classification_loss: 0.3874 405/500 [=======================>......] - ETA: 32s - loss: 2.3842 - regression_loss: 1.9970 - classification_loss: 0.3873 406/500 [=======================>......] - ETA: 31s - loss: 2.3850 - regression_loss: 1.9974 - classification_loss: 0.3876 407/500 [=======================>......] - ETA: 31s - loss: 2.3854 - regression_loss: 1.9974 - classification_loss: 0.3881 408/500 [=======================>......] - ETA: 31s - loss: 2.3865 - regression_loss: 1.9987 - classification_loss: 0.3878 409/500 [=======================>......] - ETA: 30s - loss: 2.3865 - regression_loss: 1.9987 - classification_loss: 0.3877 410/500 [=======================>......] - ETA: 30s - loss: 2.3892 - regression_loss: 2.0007 - classification_loss: 0.3885 411/500 [=======================>......] 
- ETA: 30s - loss: 2.3908 - regression_loss: 2.0021 - classification_loss: 0.3888 412/500 [=======================>......] - ETA: 29s - loss: 2.3903 - regression_loss: 2.0016 - classification_loss: 0.3887 413/500 [=======================>......] - ETA: 29s - loss: 2.3883 - regression_loss: 2.0001 - classification_loss: 0.3882 414/500 [=======================>......] - ETA: 29s - loss: 2.3872 - regression_loss: 1.9994 - classification_loss: 0.3878 415/500 [=======================>......] - ETA: 28s - loss: 2.3867 - regression_loss: 1.9991 - classification_loss: 0.3876 416/500 [=======================>......] - ETA: 28s - loss: 2.3868 - regression_loss: 1.9993 - classification_loss: 0.3875 417/500 [========================>.....] - ETA: 28s - loss: 2.3857 - regression_loss: 1.9985 - classification_loss: 0.3872 418/500 [========================>.....] - ETA: 27s - loss: 2.3844 - regression_loss: 1.9975 - classification_loss: 0.3869 419/500 [========================>.....] - ETA: 27s - loss: 2.3817 - regression_loss: 1.9949 - classification_loss: 0.3867 420/500 [========================>.....] - ETA: 27s - loss: 2.3850 - regression_loss: 1.9969 - classification_loss: 0.3881 421/500 [========================>.....] - ETA: 26s - loss: 2.3844 - regression_loss: 1.9965 - classification_loss: 0.3879 422/500 [========================>.....] - ETA: 26s - loss: 2.3844 - regression_loss: 1.9965 - classification_loss: 0.3879 423/500 [========================>.....] - ETA: 26s - loss: 2.3828 - regression_loss: 1.9952 - classification_loss: 0.3876 424/500 [========================>.....] - ETA: 25s - loss: 2.3820 - regression_loss: 1.9948 - classification_loss: 0.3872 425/500 [========================>.....] - ETA: 25s - loss: 2.3812 - regression_loss: 1.9943 - classification_loss: 0.3870 426/500 [========================>.....] - ETA: 25s - loss: 2.3809 - regression_loss: 1.9942 - classification_loss: 0.3867 427/500 [========================>.....] 
- ETA: 24s - loss: 2.3816 - regression_loss: 1.9947 - classification_loss: 0.3869 428/500 [========================>.....] - ETA: 24s - loss: 2.3805 - regression_loss: 1.9940 - classification_loss: 0.3865 429/500 [========================>.....] - ETA: 24s - loss: 2.3801 - regression_loss: 1.9936 - classification_loss: 0.3865 430/500 [========================>.....] - ETA: 23s - loss: 2.3786 - regression_loss: 1.9925 - classification_loss: 0.3861 431/500 [========================>.....] - ETA: 23s - loss: 2.3765 - regression_loss: 1.9908 - classification_loss: 0.3857 432/500 [========================>.....] - ETA: 23s - loss: 2.3753 - regression_loss: 1.9898 - classification_loss: 0.3854 433/500 [========================>.....] - ETA: 22s - loss: 2.3762 - regression_loss: 1.9907 - classification_loss: 0.3855 434/500 [=========================>....] - ETA: 22s - loss: 2.3747 - regression_loss: 1.9896 - classification_loss: 0.3850 435/500 [=========================>....] - ETA: 21s - loss: 2.3750 - regression_loss: 1.9900 - classification_loss: 0.3850 436/500 [=========================>....] - ETA: 21s - loss: 2.3747 - regression_loss: 1.9898 - classification_loss: 0.3849 437/500 [=========================>....] - ETA: 21s - loss: 2.3748 - regression_loss: 1.9899 - classification_loss: 0.3850 438/500 [=========================>....] - ETA: 20s - loss: 2.3738 - regression_loss: 1.9892 - classification_loss: 0.3845 439/500 [=========================>....] - ETA: 20s - loss: 2.3754 - regression_loss: 1.9906 - classification_loss: 0.3848 440/500 [=========================>....] - ETA: 20s - loss: 2.3779 - regression_loss: 1.9916 - classification_loss: 0.3863 441/500 [=========================>....] - ETA: 19s - loss: 2.3807 - regression_loss: 1.9942 - classification_loss: 0.3865 442/500 [=========================>....] - ETA: 19s - loss: 2.3815 - regression_loss: 1.9951 - classification_loss: 0.3865 443/500 [=========================>....] 
- ETA: 19s - loss: 2.3824 - regression_loss: 1.9957 - classification_loss: 0.3867 444/500 [=========================>....] - ETA: 18s - loss: 2.3819 - regression_loss: 1.9954 - classification_loss: 0.3865 445/500 [=========================>....] - ETA: 18s - loss: 2.3811 - regression_loss: 1.9947 - classification_loss: 0.3865 446/500 [=========================>....] - ETA: 18s - loss: 2.3818 - regression_loss: 1.9949 - classification_loss: 0.3869 447/500 [=========================>....] - ETA: 17s - loss: 2.3818 - regression_loss: 1.9948 - classification_loss: 0.3870 448/500 [=========================>....] - ETA: 17s - loss: 2.3815 - regression_loss: 1.9947 - classification_loss: 0.3869 449/500 [=========================>....] - ETA: 17s - loss: 2.3805 - regression_loss: 1.9939 - classification_loss: 0.3866 450/500 [==========================>...] - ETA: 16s - loss: 2.3794 - regression_loss: 1.9931 - classification_loss: 0.3863 451/500 [==========================>...] - ETA: 16s - loss: 2.3834 - regression_loss: 1.9960 - classification_loss: 0.3874 452/500 [==========================>...] - ETA: 16s - loss: 2.3831 - regression_loss: 1.9957 - classification_loss: 0.3874 453/500 [==========================>...] - ETA: 15s - loss: 2.3814 - regression_loss: 1.9943 - classification_loss: 0.3870 454/500 [==========================>...] - ETA: 15s - loss: 2.3794 - regression_loss: 1.9929 - classification_loss: 0.3865 455/500 [==========================>...] - ETA: 15s - loss: 2.3788 - regression_loss: 1.9925 - classification_loss: 0.3863 456/500 [==========================>...] - ETA: 14s - loss: 2.3807 - regression_loss: 1.9942 - classification_loss: 0.3865 457/500 [==========================>...] - ETA: 14s - loss: 2.3798 - regression_loss: 1.9930 - classification_loss: 0.3868 458/500 [==========================>...] - ETA: 14s - loss: 2.3801 - regression_loss: 1.9933 - classification_loss: 0.3867 459/500 [==========================>...] 
- ETA: 13s - loss: 2.3808 - regression_loss: 1.9940 - classification_loss: 0.3868 460/500 [==========================>...] - ETA: 13s - loss: 2.3837 - regression_loss: 1.9960 - classification_loss: 0.3877 461/500 [==========================>...] - ETA: 13s - loss: 2.3850 - regression_loss: 1.9971 - classification_loss: 0.3879 462/500 [==========================>...] - ETA: 12s - loss: 2.3858 - regression_loss: 1.9977 - classification_loss: 0.3880 463/500 [==========================>...] - ETA: 12s - loss: 2.3851 - regression_loss: 1.9973 - classification_loss: 0.3878 464/500 [==========================>...] - ETA: 12s - loss: 2.3852 - regression_loss: 1.9973 - classification_loss: 0.3879 465/500 [==========================>...] - ETA: 11s - loss: 2.3848 - regression_loss: 1.9972 - classification_loss: 0.3876 466/500 [==========================>...] - ETA: 11s - loss: 2.3838 - regression_loss: 1.9965 - classification_loss: 0.3873 467/500 [===========================>..] - ETA: 11s - loss: 2.3830 - regression_loss: 1.9960 - classification_loss: 0.3870 468/500 [===========================>..] - ETA: 10s - loss: 2.3818 - regression_loss: 1.9952 - classification_loss: 0.3866 469/500 [===========================>..] - ETA: 10s - loss: 2.3807 - regression_loss: 1.9944 - classification_loss: 0.3863 470/500 [===========================>..] - ETA: 10s - loss: 2.3813 - regression_loss: 1.9948 - classification_loss: 0.3864 471/500 [===========================>..] - ETA: 9s - loss: 2.3810 - regression_loss: 1.9947 - classification_loss: 0.3863  472/500 [===========================>..] - ETA: 9s - loss: 2.3817 - regression_loss: 1.9955 - classification_loss: 0.3862 473/500 [===========================>..] - ETA: 9s - loss: 2.3844 - regression_loss: 1.9976 - classification_loss: 0.3867 474/500 [===========================>..] - ETA: 8s - loss: 2.3845 - regression_loss: 1.9979 - classification_loss: 0.3866 475/500 [===========================>..] 
- ETA: 8s - loss: 2.3841 - regression_loss: 1.9975 - classification_loss: 0.3865 476/500 [===========================>..] - ETA: 8s - loss: 2.3846 - regression_loss: 1.9982 - classification_loss: 0.3864 477/500 [===========================>..] - ETA: 7s - loss: 2.3842 - regression_loss: 1.9981 - classification_loss: 0.3862 478/500 [===========================>..] - ETA: 7s - loss: 2.3841 - regression_loss: 1.9982 - classification_loss: 0.3858 479/500 [===========================>..] - ETA: 7s - loss: 2.3839 - regression_loss: 1.9982 - classification_loss: 0.3857 480/500 [===========================>..] - ETA: 6s - loss: 2.3831 - regression_loss: 1.9975 - classification_loss: 0.3856 481/500 [===========================>..] - ETA: 6s - loss: 2.3824 - regression_loss: 1.9971 - classification_loss: 0.3853 482/500 [===========================>..] - ETA: 6s - loss: 2.3804 - regression_loss: 1.9953 - classification_loss: 0.3852 483/500 [===========================>..] - ETA: 5s - loss: 2.3797 - regression_loss: 1.9948 - classification_loss: 0.3849 484/500 [============================>.] - ETA: 5s - loss: 2.3793 - regression_loss: 1.9945 - classification_loss: 0.3848 485/500 [============================>.] - ETA: 5s - loss: 2.3790 - regression_loss: 1.9941 - classification_loss: 0.3849 486/500 [============================>.] - ETA: 4s - loss: 2.3769 - regression_loss: 1.9925 - classification_loss: 0.3845 487/500 [============================>.] - ETA: 4s - loss: 2.3770 - regression_loss: 1.9925 - classification_loss: 0.3845 488/500 [============================>.] - ETA: 4s - loss: 2.3794 - regression_loss: 1.9938 - classification_loss: 0.3855 489/500 [============================>.] - ETA: 3s - loss: 2.3778 - regression_loss: 1.9926 - classification_loss: 0.3852 490/500 [============================>.] - ETA: 3s - loss: 2.3776 - regression_loss: 1.9925 - classification_loss: 0.3851 491/500 [============================>.] 
- ETA: 3s - loss: 2.3761 - regression_loss: 1.9913 - classification_loss: 0.3848 492/500 [============================>.] - ETA: 2s - loss: 2.3756 - regression_loss: 1.9909 - classification_loss: 0.3847 493/500 [============================>.] - ETA: 2s - loss: 2.3751 - regression_loss: 1.9905 - classification_loss: 0.3846 494/500 [============================>.] - ETA: 2s - loss: 2.3741 - regression_loss: 1.9897 - classification_loss: 0.3843 495/500 [============================>.] - ETA: 1s - loss: 2.3730 - regression_loss: 1.9889 - classification_loss: 0.3841 496/500 [============================>.] - ETA: 1s - loss: 2.3733 - regression_loss: 1.9893 - classification_loss: 0.3840 497/500 [============================>.] - ETA: 1s - loss: 2.3731 - regression_loss: 1.9891 - classification_loss: 0.3840 498/500 [============================>.] - ETA: 0s - loss: 2.3723 - regression_loss: 1.9885 - classification_loss: 0.3838 499/500 [============================>.] - ETA: 0s - loss: 2.3717 - regression_loss: 1.9881 - classification_loss: 0.3836 500/500 [==============================] - 169s 338ms/step - loss: 2.3724 - regression_loss: 1.9884 - classification_loss: 0.3840 326 instances of class plum with average precision: 0.6687 mAP: 0.6687 Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5 Epoch 3/150 1/500 [..............................] - ETA: 2:43 - loss: 1.3456 - regression_loss: 1.1605 - classification_loss: 0.1852 2/500 [..............................] - ETA: 2:49 - loss: 1.7134 - regression_loss: 1.5015 - classification_loss: 0.2119 3/500 [..............................] - ETA: 2:49 - loss: 1.9121 - regression_loss: 1.6742 - classification_loss: 0.2379 4/500 [..............................] - ETA: 2:47 - loss: 2.0729 - regression_loss: 1.7863 - classification_loss: 0.2867 5/500 [..............................] - ETA: 2:48 - loss: 2.1652 - regression_loss: 1.8575 - classification_loss: 0.3077 6/500 [..............................] 
85/500 [====>.........................] - ETA: 2:20 - loss: 2.1411 - regression_loss: 1.7912 - classification_loss: 0.3498
- ETA: 2:20 - loss: 2.1511 - regression_loss: 1.7993 - classification_loss: 0.3517 87/500 [====>.........................] - ETA: 2:19 - loss: 2.1513 - regression_loss: 1.7999 - classification_loss: 0.3514 88/500 [====>.........................] - ETA: 2:19 - loss: 2.1548 - regression_loss: 1.8031 - classification_loss: 0.3516 89/500 [====>.........................] - ETA: 2:19 - loss: 2.1463 - regression_loss: 1.7960 - classification_loss: 0.3502 90/500 [====>.........................] - ETA: 2:18 - loss: 2.1416 - regression_loss: 1.7926 - classification_loss: 0.3490 91/500 [====>.........................] - ETA: 2:18 - loss: 2.1427 - regression_loss: 1.7938 - classification_loss: 0.3489 92/500 [====>.........................] - ETA: 2:18 - loss: 2.1469 - regression_loss: 1.7982 - classification_loss: 0.3486 93/500 [====>.........................] - ETA: 2:18 - loss: 2.1455 - regression_loss: 1.7974 - classification_loss: 0.3481 94/500 [====>.........................] - ETA: 2:17 - loss: 2.1557 - regression_loss: 1.8042 - classification_loss: 0.3516 95/500 [====>.........................] - ETA: 2:17 - loss: 2.1609 - regression_loss: 1.8093 - classification_loss: 0.3516 96/500 [====>.........................] - ETA: 2:17 - loss: 2.1590 - regression_loss: 1.8081 - classification_loss: 0.3509 97/500 [====>.........................] - ETA: 2:16 - loss: 2.1561 - regression_loss: 1.8058 - classification_loss: 0.3503 98/500 [====>.........................] - ETA: 2:16 - loss: 2.1515 - regression_loss: 1.8030 - classification_loss: 0.3485 99/500 [====>.........................] - ETA: 2:16 - loss: 2.1502 - regression_loss: 1.8024 - classification_loss: 0.3477 100/500 [=====>........................] - ETA: 2:15 - loss: 2.1486 - regression_loss: 1.8013 - classification_loss: 0.3473 101/500 [=====>........................] - ETA: 2:15 - loss: 2.1447 - regression_loss: 1.7982 - classification_loss: 0.3465 102/500 [=====>........................] 
- ETA: 2:14 - loss: 2.1386 - regression_loss: 1.7932 - classification_loss: 0.3454 103/500 [=====>........................] - ETA: 2:14 - loss: 2.1371 - regression_loss: 1.7923 - classification_loss: 0.3448 104/500 [=====>........................] - ETA: 2:14 - loss: 2.1422 - regression_loss: 1.7963 - classification_loss: 0.3459 105/500 [=====>........................] - ETA: 2:13 - loss: 2.1470 - regression_loss: 1.8005 - classification_loss: 0.3465 106/500 [=====>........................] - ETA: 2:13 - loss: 2.1458 - regression_loss: 1.7998 - classification_loss: 0.3460 107/500 [=====>........................] - ETA: 2:13 - loss: 2.1458 - regression_loss: 1.8000 - classification_loss: 0.3458 108/500 [=====>........................] - ETA: 2:12 - loss: 2.1411 - regression_loss: 1.7959 - classification_loss: 0.3452 109/500 [=====>........................] - ETA: 2:12 - loss: 2.1446 - regression_loss: 1.8002 - classification_loss: 0.3444 110/500 [=====>........................] - ETA: 2:12 - loss: 2.1519 - regression_loss: 1.8060 - classification_loss: 0.3459 111/500 [=====>........................] - ETA: 2:11 - loss: 2.1556 - regression_loss: 1.8064 - classification_loss: 0.3492 112/500 [=====>........................] - ETA: 2:11 - loss: 2.1539 - regression_loss: 1.8046 - classification_loss: 0.3493 113/500 [=====>........................] - ETA: 2:11 - loss: 2.1559 - regression_loss: 1.8066 - classification_loss: 0.3493 114/500 [=====>........................] - ETA: 2:10 - loss: 2.1608 - regression_loss: 1.8105 - classification_loss: 0.3503 115/500 [=====>........................] - ETA: 2:10 - loss: 2.1587 - regression_loss: 1.8091 - classification_loss: 0.3496 116/500 [=====>........................] - ETA: 2:10 - loss: 2.1578 - regression_loss: 1.8082 - classification_loss: 0.3495 117/500 [======>.......................] - ETA: 2:09 - loss: 2.1566 - regression_loss: 1.8075 - classification_loss: 0.3491 118/500 [======>.......................] 
- ETA: 2:09 - loss: 2.1477 - regression_loss: 1.7998 - classification_loss: 0.3479 119/500 [======>.......................] - ETA: 2:09 - loss: 2.1476 - regression_loss: 1.7998 - classification_loss: 0.3478 120/500 [======>.......................] - ETA: 2:08 - loss: 2.1363 - regression_loss: 1.7900 - classification_loss: 0.3464 121/500 [======>.......................] - ETA: 2:08 - loss: 2.1338 - regression_loss: 1.7881 - classification_loss: 0.3456 122/500 [======>.......................] - ETA: 2:08 - loss: 2.1331 - regression_loss: 1.7874 - classification_loss: 0.3457 123/500 [======>.......................] - ETA: 2:07 - loss: 2.1305 - regression_loss: 1.7854 - classification_loss: 0.3450 124/500 [======>.......................] - ETA: 2:07 - loss: 2.1310 - regression_loss: 1.7857 - classification_loss: 0.3452 125/500 [======>.......................] - ETA: 2:07 - loss: 2.1329 - regression_loss: 1.7877 - classification_loss: 0.3452 126/500 [======>.......................] - ETA: 2:06 - loss: 2.1285 - regression_loss: 1.7846 - classification_loss: 0.3439 127/500 [======>.......................] - ETA: 2:06 - loss: 2.1285 - regression_loss: 1.7851 - classification_loss: 0.3434 128/500 [======>.......................] - ETA: 2:06 - loss: 2.1214 - regression_loss: 1.7794 - classification_loss: 0.3420 129/500 [======>.......................] - ETA: 2:05 - loss: 2.1261 - regression_loss: 1.7832 - classification_loss: 0.3429 130/500 [======>.......................] - ETA: 2:05 - loss: 2.1296 - regression_loss: 1.7870 - classification_loss: 0.3426 131/500 [======>.......................] - ETA: 2:05 - loss: 2.1249 - regression_loss: 1.7834 - classification_loss: 0.3415 132/500 [======>.......................] - ETA: 2:04 - loss: 2.1284 - regression_loss: 1.7863 - classification_loss: 0.3421 133/500 [======>.......................] - ETA: 2:04 - loss: 2.1297 - regression_loss: 1.7879 - classification_loss: 0.3418 134/500 [=======>......................] 
- ETA: 2:04 - loss: 2.1298 - regression_loss: 1.7879 - classification_loss: 0.3419 135/500 [=======>......................] - ETA: 2:03 - loss: 2.1277 - regression_loss: 1.7863 - classification_loss: 0.3414 136/500 [=======>......................] - ETA: 2:03 - loss: 2.1325 - regression_loss: 1.7908 - classification_loss: 0.3417 137/500 [=======>......................] - ETA: 2:02 - loss: 2.1262 - regression_loss: 1.7860 - classification_loss: 0.3402 138/500 [=======>......................] - ETA: 2:02 - loss: 2.1301 - regression_loss: 1.7893 - classification_loss: 0.3409 139/500 [=======>......................] - ETA: 2:02 - loss: 2.1286 - regression_loss: 1.7882 - classification_loss: 0.3404 140/500 [=======>......................] - ETA: 2:02 - loss: 2.1291 - regression_loss: 1.7895 - classification_loss: 0.3396 141/500 [=======>......................] - ETA: 2:01 - loss: 2.1311 - regression_loss: 1.7921 - classification_loss: 0.3390 142/500 [=======>......................] - ETA: 2:01 - loss: 2.1301 - regression_loss: 1.7916 - classification_loss: 0.3385 143/500 [=======>......................] - ETA: 2:01 - loss: 2.1309 - regression_loss: 1.7928 - classification_loss: 0.3380 144/500 [=======>......................] - ETA: 2:00 - loss: 2.1281 - regression_loss: 1.7904 - classification_loss: 0.3377 145/500 [=======>......................] - ETA: 2:00 - loss: 2.1262 - regression_loss: 1.7892 - classification_loss: 0.3370 146/500 [=======>......................] - ETA: 2:00 - loss: 2.1391 - regression_loss: 1.7999 - classification_loss: 0.3392 147/500 [=======>......................] - ETA: 1:59 - loss: 2.1380 - regression_loss: 1.7995 - classification_loss: 0.3384 148/500 [=======>......................] - ETA: 1:59 - loss: 2.1369 - regression_loss: 1.7990 - classification_loss: 0.3379 149/500 [=======>......................] - ETA: 1:59 - loss: 2.1388 - regression_loss: 1.8007 - classification_loss: 0.3381 150/500 [========>.....................] 
- ETA: 1:58 - loss: 2.1419 - regression_loss: 1.8040 - classification_loss: 0.3378 151/500 [========>.....................] - ETA: 1:58 - loss: 2.1458 - regression_loss: 1.8067 - classification_loss: 0.3391 152/500 [========>.....................] - ETA: 1:58 - loss: 2.1418 - regression_loss: 1.8033 - classification_loss: 0.3386 153/500 [========>.....................] - ETA: 1:57 - loss: 2.1400 - regression_loss: 1.8020 - classification_loss: 0.3380 154/500 [========>.....................] - ETA: 1:57 - loss: 2.1403 - regression_loss: 1.8022 - classification_loss: 0.3381 155/500 [========>.....................] - ETA: 1:57 - loss: 2.1470 - regression_loss: 1.8083 - classification_loss: 0.3386 156/500 [========>.....................] - ETA: 1:56 - loss: 2.1512 - regression_loss: 1.8108 - classification_loss: 0.3404 157/500 [========>.....................] - ETA: 1:56 - loss: 2.1497 - regression_loss: 1.8100 - classification_loss: 0.3397 158/500 [========>.....................] - ETA: 1:56 - loss: 2.1466 - regression_loss: 1.8078 - classification_loss: 0.3388 159/500 [========>.....................] - ETA: 1:55 - loss: 2.1405 - regression_loss: 1.8031 - classification_loss: 0.3374 160/500 [========>.....................] - ETA: 1:55 - loss: 2.1410 - regression_loss: 1.8039 - classification_loss: 0.3371 161/500 [========>.....................] - ETA: 1:55 - loss: 2.1359 - regression_loss: 1.8002 - classification_loss: 0.3357 162/500 [========>.....................] - ETA: 1:54 - loss: 2.1374 - regression_loss: 1.8013 - classification_loss: 0.3361 163/500 [========>.....................] - ETA: 1:54 - loss: 2.1357 - regression_loss: 1.7998 - classification_loss: 0.3359 164/500 [========>.....................] - ETA: 1:54 - loss: 2.1339 - regression_loss: 1.7985 - classification_loss: 0.3353 165/500 [========>.....................] - ETA: 1:53 - loss: 2.1286 - regression_loss: 1.7942 - classification_loss: 0.3343 166/500 [========>.....................] 
- ETA: 1:53 - loss: 2.1259 - regression_loss: 1.7919 - classification_loss: 0.3340 167/500 [=========>....................] - ETA: 1:53 - loss: 2.1261 - regression_loss: 1.7922 - classification_loss: 0.3339 168/500 [=========>....................] - ETA: 1:52 - loss: 2.1332 - regression_loss: 1.7973 - classification_loss: 0.3358 169/500 [=========>....................] - ETA: 1:52 - loss: 2.1325 - regression_loss: 1.7968 - classification_loss: 0.3357 170/500 [=========>....................] - ETA: 1:52 - loss: 2.1281 - regression_loss: 1.7931 - classification_loss: 0.3350 171/500 [=========>....................] - ETA: 1:51 - loss: 2.1285 - regression_loss: 1.7934 - classification_loss: 0.3350 172/500 [=========>....................] - ETA: 1:51 - loss: 2.1265 - regression_loss: 1.7913 - classification_loss: 0.3351 173/500 [=========>....................] - ETA: 1:51 - loss: 2.1251 - regression_loss: 1.7907 - classification_loss: 0.3344 174/500 [=========>....................] - ETA: 1:50 - loss: 2.1272 - regression_loss: 1.7923 - classification_loss: 0.3349 175/500 [=========>....................] - ETA: 1:50 - loss: 2.1302 - regression_loss: 1.7943 - classification_loss: 0.3359 176/500 [=========>....................] - ETA: 1:50 - loss: 2.1302 - regression_loss: 1.7948 - classification_loss: 0.3354 177/500 [=========>....................] - ETA: 1:49 - loss: 2.1311 - regression_loss: 1.7957 - classification_loss: 0.3354 178/500 [=========>....................] - ETA: 1:49 - loss: 2.1338 - regression_loss: 1.7982 - classification_loss: 0.3356 179/500 [=========>....................] - ETA: 1:49 - loss: 2.1357 - regression_loss: 1.8001 - classification_loss: 0.3356 180/500 [=========>....................] - ETA: 1:48 - loss: 2.1370 - regression_loss: 1.8011 - classification_loss: 0.3358 181/500 [=========>....................] - ETA: 1:48 - loss: 2.1348 - regression_loss: 1.7996 - classification_loss: 0.3352 182/500 [=========>....................] 
- ETA: 1:48 - loss: 2.1294 - regression_loss: 1.7954 - classification_loss: 0.3340 183/500 [=========>....................] - ETA: 1:47 - loss: 2.1268 - regression_loss: 1.7934 - classification_loss: 0.3334 184/500 [==========>...................] - ETA: 1:47 - loss: 2.1247 - regression_loss: 1.7913 - classification_loss: 0.3333 185/500 [==========>...................] - ETA: 1:47 - loss: 2.1242 - regression_loss: 1.7912 - classification_loss: 0.3329 186/500 [==========>...................] - ETA: 1:46 - loss: 2.1248 - regression_loss: 1.7921 - classification_loss: 0.3327 187/500 [==========>...................] - ETA: 1:46 - loss: 2.1248 - regression_loss: 1.7923 - classification_loss: 0.3324 188/500 [==========>...................] - ETA: 1:45 - loss: 2.1319 - regression_loss: 1.7969 - classification_loss: 0.3350 189/500 [==========>...................] - ETA: 1:45 - loss: 2.1389 - regression_loss: 1.8028 - classification_loss: 0.3361 190/500 [==========>...................] - ETA: 1:45 - loss: 2.1432 - regression_loss: 1.8063 - classification_loss: 0.3369 191/500 [==========>...................] - ETA: 1:44 - loss: 2.1428 - regression_loss: 1.8059 - classification_loss: 0.3369 192/500 [==========>...................] - ETA: 1:44 - loss: 2.1436 - regression_loss: 1.8061 - classification_loss: 0.3375 193/500 [==========>...................] - ETA: 1:44 - loss: 2.1429 - regression_loss: 1.8061 - classification_loss: 0.3369 194/500 [==========>...................] - ETA: 1:43 - loss: 2.1450 - regression_loss: 1.8079 - classification_loss: 0.3371 195/500 [==========>...................] - ETA: 1:43 - loss: 2.1458 - regression_loss: 1.8084 - classification_loss: 0.3374 196/500 [==========>...................] - ETA: 1:43 - loss: 2.1442 - regression_loss: 1.8074 - classification_loss: 0.3368 197/500 [==========>...................] - ETA: 1:42 - loss: 2.1439 - regression_loss: 1.8071 - classification_loss: 0.3368 198/500 [==========>...................] 
- ETA: 1:42 - loss: 2.1440 - regression_loss: 1.8074 - classification_loss: 0.3365 199/500 [==========>...................] - ETA: 1:42 - loss: 2.1392 - regression_loss: 1.8037 - classification_loss: 0.3355 200/500 [===========>..................] - ETA: 1:41 - loss: 2.1409 - regression_loss: 1.8055 - classification_loss: 0.3355 201/500 [===========>..................] - ETA: 1:41 - loss: 2.1408 - regression_loss: 1.8054 - classification_loss: 0.3354 202/500 [===========>..................] - ETA: 1:41 - loss: 2.1449 - regression_loss: 1.8095 - classification_loss: 0.3354 203/500 [===========>..................] - ETA: 1:40 - loss: 2.1457 - regression_loss: 1.8105 - classification_loss: 0.3351 204/500 [===========>..................] - ETA: 1:40 - loss: 2.1445 - regression_loss: 1.8100 - classification_loss: 0.3345 205/500 [===========>..................] - ETA: 1:40 - loss: 2.1422 - regression_loss: 1.8083 - classification_loss: 0.3338 206/500 [===========>..................] - ETA: 1:39 - loss: 2.1396 - regression_loss: 1.8064 - classification_loss: 0.3332 207/500 [===========>..................] - ETA: 1:39 - loss: 2.1396 - regression_loss: 1.8064 - classification_loss: 0.3332 208/500 [===========>..................] - ETA: 1:39 - loss: 2.1463 - regression_loss: 1.8098 - classification_loss: 0.3365 209/500 [===========>..................] - ETA: 1:38 - loss: 2.1436 - regression_loss: 1.8078 - classification_loss: 0.3358 210/500 [===========>..................] - ETA: 1:38 - loss: 2.1414 - regression_loss: 1.8061 - classification_loss: 0.3353 211/500 [===========>..................] - ETA: 1:38 - loss: 2.1387 - regression_loss: 1.8041 - classification_loss: 0.3346 212/500 [===========>..................] - ETA: 1:37 - loss: 2.1378 - regression_loss: 1.8036 - classification_loss: 0.3342 213/500 [===========>..................] - ETA: 1:37 - loss: 2.1372 - regression_loss: 1.8033 - classification_loss: 0.3339 214/500 [===========>..................] 
- ETA: 1:37 - loss: 2.1370 - regression_loss: 1.8031 - classification_loss: 0.3339 215/500 [===========>..................] - ETA: 1:36 - loss: 2.1374 - regression_loss: 1.8036 - classification_loss: 0.3338 216/500 [===========>..................] - ETA: 1:36 - loss: 2.1410 - regression_loss: 1.8064 - classification_loss: 0.3347 217/500 [============>.................] - ETA: 1:36 - loss: 2.1413 - regression_loss: 1.8066 - classification_loss: 0.3347 218/500 [============>.................] - ETA: 1:35 - loss: 2.1417 - regression_loss: 1.8068 - classification_loss: 0.3349 219/500 [============>.................] - ETA: 1:35 - loss: 2.1418 - regression_loss: 1.8072 - classification_loss: 0.3346 220/500 [============>.................] - ETA: 1:35 - loss: 2.1407 - regression_loss: 1.8063 - classification_loss: 0.3344 221/500 [============>.................] - ETA: 1:34 - loss: 2.1406 - regression_loss: 1.8063 - classification_loss: 0.3343 222/500 [============>.................] - ETA: 1:34 - loss: 2.1386 - regression_loss: 1.8049 - classification_loss: 0.3337 223/500 [============>.................] - ETA: 1:34 - loss: 2.1414 - regression_loss: 1.8070 - classification_loss: 0.3344 224/500 [============>.................] - ETA: 1:33 - loss: 2.1443 - regression_loss: 1.8092 - classification_loss: 0.3351 225/500 [============>.................] - ETA: 1:33 - loss: 2.1439 - regression_loss: 1.8094 - classification_loss: 0.3345 226/500 [============>.................] - ETA: 1:33 - loss: 2.1416 - regression_loss: 1.8076 - classification_loss: 0.3340 227/500 [============>.................] - ETA: 1:32 - loss: 2.1365 - regression_loss: 1.8033 - classification_loss: 0.3332 228/500 [============>.................] - ETA: 1:32 - loss: 2.1356 - regression_loss: 1.8028 - classification_loss: 0.3327 229/500 [============>.................] - ETA: 1:32 - loss: 2.1335 - regression_loss: 1.8013 - classification_loss: 0.3322 230/500 [============>.................] 
- ETA: 1:31 - loss: 2.1308 - regression_loss: 1.7988 - classification_loss: 0.3320 231/500 [============>.................] - ETA: 1:31 - loss: 2.1291 - regression_loss: 1.7975 - classification_loss: 0.3316 232/500 [============>.................] - ETA: 1:31 - loss: 2.1252 - regression_loss: 1.7944 - classification_loss: 0.3308 233/500 [============>.................] - ETA: 1:30 - loss: 2.1327 - regression_loss: 1.8002 - classification_loss: 0.3324 234/500 [=============>................] - ETA: 1:30 - loss: 2.1248 - regression_loss: 1.7925 - classification_loss: 0.3323 235/500 [=============>................] - ETA: 1:29 - loss: 2.1210 - regression_loss: 1.7894 - classification_loss: 0.3315 236/500 [=============>................] - ETA: 1:29 - loss: 2.1223 - regression_loss: 1.7902 - classification_loss: 0.3321 237/500 [=============>................] - ETA: 1:29 - loss: 2.1216 - regression_loss: 1.7893 - classification_loss: 0.3323 238/500 [=============>................] - ETA: 1:28 - loss: 2.1240 - regression_loss: 1.7909 - classification_loss: 0.3331 239/500 [=============>................] - ETA: 1:28 - loss: 2.1273 - regression_loss: 1.7929 - classification_loss: 0.3343 240/500 [=============>................] - ETA: 1:28 - loss: 2.1289 - regression_loss: 1.7939 - classification_loss: 0.3350 241/500 [=============>................] - ETA: 1:27 - loss: 2.1328 - regression_loss: 1.7976 - classification_loss: 0.3352 242/500 [=============>................] - ETA: 1:27 - loss: 2.1341 - regression_loss: 1.7988 - classification_loss: 0.3353 243/500 [=============>................] - ETA: 1:27 - loss: 2.1341 - regression_loss: 1.7992 - classification_loss: 0.3349 244/500 [=============>................] - ETA: 1:26 - loss: 2.1343 - regression_loss: 1.7994 - classification_loss: 0.3349 245/500 [=============>................] - ETA: 1:26 - loss: 2.1347 - regression_loss: 1.7996 - classification_loss: 0.3351 246/500 [=============>................] 
- ETA: 1:26 - loss: 2.1341 - regression_loss: 1.7992 - classification_loss: 0.3348 247/500 [=============>................] - ETA: 1:25 - loss: 2.1342 - regression_loss: 1.7991 - classification_loss: 0.3351 248/500 [=============>................] - ETA: 1:25 - loss: 2.1320 - regression_loss: 1.7972 - classification_loss: 0.3348 249/500 [=============>................] - ETA: 1:25 - loss: 2.1328 - regression_loss: 1.7978 - classification_loss: 0.3350 250/500 [==============>...............] - ETA: 1:24 - loss: 2.1312 - regression_loss: 1.7964 - classification_loss: 0.3348 251/500 [==============>...............] - ETA: 1:24 - loss: 2.1310 - regression_loss: 1.7961 - classification_loss: 0.3349 252/500 [==============>...............] - ETA: 1:24 - loss: 2.1318 - regression_loss: 1.7962 - classification_loss: 0.3356 253/500 [==============>...............] - ETA: 1:23 - loss: 2.1360 - regression_loss: 1.7986 - classification_loss: 0.3374 254/500 [==============>...............] - ETA: 1:23 - loss: 2.1360 - regression_loss: 1.7989 - classification_loss: 0.3372 255/500 [==============>...............] - ETA: 1:23 - loss: 2.1346 - regression_loss: 1.7979 - classification_loss: 0.3367 256/500 [==============>...............] - ETA: 1:22 - loss: 2.1324 - regression_loss: 1.7963 - classification_loss: 0.3361 257/500 [==============>...............] - ETA: 1:22 - loss: 2.1332 - regression_loss: 1.7970 - classification_loss: 0.3362 258/500 [==============>...............] - ETA: 1:22 - loss: 2.1303 - regression_loss: 1.7948 - classification_loss: 0.3356 259/500 [==============>...............] - ETA: 1:21 - loss: 2.1311 - regression_loss: 1.7954 - classification_loss: 0.3358 260/500 [==============>...............] - ETA: 1:21 - loss: 2.1304 - regression_loss: 1.7948 - classification_loss: 0.3356 261/500 [==============>...............] - ETA: 1:21 - loss: 2.1294 - regression_loss: 1.7941 - classification_loss: 0.3353 262/500 [==============>...............] 
- ETA: 1:20 - loss: 2.1297 - regression_loss: 1.7945 - classification_loss: 0.3352 263/500 [==============>...............] - ETA: 1:20 - loss: 2.1286 - regression_loss: 1.7937 - classification_loss: 0.3349 264/500 [==============>...............] - ETA: 1:20 - loss: 2.1287 - regression_loss: 1.7940 - classification_loss: 0.3348 265/500 [==============>...............] - ETA: 1:19 - loss: 2.1322 - regression_loss: 1.7950 - classification_loss: 0.3372 266/500 [==============>...............] - ETA: 1:19 - loss: 2.1318 - regression_loss: 1.7947 - classification_loss: 0.3371 267/500 [===============>..............] - ETA: 1:19 - loss: 2.1338 - regression_loss: 1.7959 - classification_loss: 0.3378 268/500 [===============>..............] - ETA: 1:18 - loss: 2.1350 - regression_loss: 1.7972 - classification_loss: 0.3378 269/500 [===============>..............] - ETA: 1:18 - loss: 2.1353 - regression_loss: 1.7976 - classification_loss: 0.3377 270/500 [===============>..............] - ETA: 1:18 - loss: 2.1343 - regression_loss: 1.7967 - classification_loss: 0.3376 271/500 [===============>..............] - ETA: 1:17 - loss: 2.1338 - regression_loss: 1.7963 - classification_loss: 0.3374 272/500 [===============>..............] - ETA: 1:17 - loss: 2.1342 - regression_loss: 1.7970 - classification_loss: 0.3372 273/500 [===============>..............] - ETA: 1:17 - loss: 2.1346 - regression_loss: 1.7975 - classification_loss: 0.3371 274/500 [===============>..............] - ETA: 1:16 - loss: 2.1337 - regression_loss: 1.7969 - classification_loss: 0.3368 275/500 [===============>..............] - ETA: 1:16 - loss: 2.1312 - regression_loss: 1.7949 - classification_loss: 0.3363 276/500 [===============>..............] - ETA: 1:16 - loss: 2.1302 - regression_loss: 1.7942 - classification_loss: 0.3360 277/500 [===============>..............] - ETA: 1:15 - loss: 2.1295 - regression_loss: 1.7935 - classification_loss: 0.3360 278/500 [===============>..............] 
- ETA: 1:15 - loss: 2.1277 - regression_loss: 1.7916 - classification_loss: 0.3361 279/500 [===============>..............] - ETA: 1:15 - loss: 2.1276 - regression_loss: 1.7916 - classification_loss: 0.3360 280/500 [===============>..............] - ETA: 1:14 - loss: 2.1280 - regression_loss: 1.7920 - classification_loss: 0.3360 281/500 [===============>..............] - ETA: 1:14 - loss: 2.1291 - regression_loss: 1.7931 - classification_loss: 0.3360 282/500 [===============>..............] - ETA: 1:14 - loss: 2.1265 - regression_loss: 1.7909 - classification_loss: 0.3355 283/500 [===============>..............] - ETA: 1:13 - loss: 2.1304 - regression_loss: 1.7938 - classification_loss: 0.3366 284/500 [================>.............] - ETA: 1:13 - loss: 2.1288 - regression_loss: 1.7926 - classification_loss: 0.3362 285/500 [================>.............] - ETA: 1:13 - loss: 2.1301 - regression_loss: 1.7936 - classification_loss: 0.3366 286/500 [================>.............] - ETA: 1:12 - loss: 2.1294 - regression_loss: 1.7932 - classification_loss: 0.3362 287/500 [================>.............] - ETA: 1:12 - loss: 2.1290 - regression_loss: 1.7929 - classification_loss: 0.3361 288/500 [================>.............] - ETA: 1:12 - loss: 2.1289 - regression_loss: 1.7927 - classification_loss: 0.3361 289/500 [================>.............] - ETA: 1:11 - loss: 2.1284 - regression_loss: 1.7924 - classification_loss: 0.3360 290/500 [================>.............] - ETA: 1:11 - loss: 2.1290 - regression_loss: 1.7931 - classification_loss: 0.3360 291/500 [================>.............] - ETA: 1:11 - loss: 2.1250 - regression_loss: 1.7897 - classification_loss: 0.3353 292/500 [================>.............] - ETA: 1:10 - loss: 2.1258 - regression_loss: 1.7903 - classification_loss: 0.3354 293/500 [================>.............] - ETA: 1:10 - loss: 2.1262 - regression_loss: 1.7901 - classification_loss: 0.3361 294/500 [================>.............] 
- ETA: 1:10 - loss: 2.1249 - regression_loss: 1.7887 - classification_loss: 0.3362 295/500 [================>.............]
[per-batch progress updates for steps 295–499 of epoch 3 omitted]
500/500 [==============================] - 170s 340ms/step - loss: 2.1132 - regression_loss: 1.7842 - classification_loss: 0.3290
326 instances of class plum with average precision: 0.7665
mAP: 0.7665
Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/150
1/500 [..............................]
- ETA: 2:47 - loss: 1.7832 - regression_loss: 1.5567 - classification_loss: 0.2265 2/500 [..............................]
[per-batch progress updates for steps 2–127 of epoch 4 omitted]
128/500 [======>.......................] - ETA: 2:05 - loss: 1.8995 - regression_loss: 1.6116 - classification_loss: 0.2879 129/500 [======>.......................]
- ETA: 2:05 - loss: 1.8990 - regression_loss: 1.6111 - classification_loss: 0.2879 130/500 [======>.......................] - ETA: 2:04 - loss: 1.8989 - regression_loss: 1.6113 - classification_loss: 0.2876 131/500 [======>.......................] - ETA: 2:04 - loss: 1.8951 - regression_loss: 1.6083 - classification_loss: 0.2868 132/500 [======>.......................] - ETA: 2:04 - loss: 1.8964 - regression_loss: 1.6085 - classification_loss: 0.2879 133/500 [======>.......................] - ETA: 2:03 - loss: 1.8960 - regression_loss: 1.6087 - classification_loss: 0.2873 134/500 [=======>......................] - ETA: 2:03 - loss: 1.9020 - regression_loss: 1.6136 - classification_loss: 0.2884 135/500 [=======>......................] - ETA: 2:03 - loss: 1.9023 - regression_loss: 1.6148 - classification_loss: 0.2876 136/500 [=======>......................] - ETA: 2:02 - loss: 1.9022 - regression_loss: 1.6151 - classification_loss: 0.2871 137/500 [=======>......................] - ETA: 2:02 - loss: 1.8998 - regression_loss: 1.6134 - classification_loss: 0.2865 138/500 [=======>......................] - ETA: 2:01 - loss: 1.9187 - regression_loss: 1.6283 - classification_loss: 0.2905 139/500 [=======>......................] - ETA: 2:01 - loss: 1.9175 - regression_loss: 1.6272 - classification_loss: 0.2903 140/500 [=======>......................] - ETA: 2:01 - loss: 1.9166 - regression_loss: 1.6268 - classification_loss: 0.2899 141/500 [=======>......................] - ETA: 2:00 - loss: 1.9139 - regression_loss: 1.6246 - classification_loss: 0.2893 142/500 [=======>......................] - ETA: 2:00 - loss: 1.9152 - regression_loss: 1.6257 - classification_loss: 0.2895 143/500 [=======>......................] - ETA: 2:00 - loss: 1.9226 - regression_loss: 1.6309 - classification_loss: 0.2917 144/500 [=======>......................] - ETA: 1:59 - loss: 1.9227 - regression_loss: 1.6306 - classification_loss: 0.2920 145/500 [=======>......................] 
- ETA: 1:59 - loss: 1.9198 - regression_loss: 1.6283 - classification_loss: 0.2915 146/500 [=======>......................] - ETA: 1:59 - loss: 1.9159 - regression_loss: 1.6251 - classification_loss: 0.2908 147/500 [=======>......................] - ETA: 1:58 - loss: 1.9195 - regression_loss: 1.6279 - classification_loss: 0.2916 148/500 [=======>......................] - ETA: 1:58 - loss: 1.9223 - regression_loss: 1.6299 - classification_loss: 0.2924 149/500 [=======>......................] - ETA: 1:58 - loss: 1.9198 - regression_loss: 1.6283 - classification_loss: 0.2915 150/500 [========>.....................] - ETA: 1:57 - loss: 1.9239 - regression_loss: 1.6319 - classification_loss: 0.2920 151/500 [========>.....................] - ETA: 1:57 - loss: 1.9250 - regression_loss: 1.6321 - classification_loss: 0.2929 152/500 [========>.....................] - ETA: 1:57 - loss: 1.9281 - regression_loss: 1.6347 - classification_loss: 0.2935 153/500 [========>.....................] - ETA: 1:56 - loss: 1.9227 - regression_loss: 1.6283 - classification_loss: 0.2944 154/500 [========>.....................] - ETA: 1:56 - loss: 1.9220 - regression_loss: 1.6283 - classification_loss: 0.2937 155/500 [========>.....................] - ETA: 1:56 - loss: 1.9251 - regression_loss: 1.6310 - classification_loss: 0.2941 156/500 [========>.....................] - ETA: 1:55 - loss: 1.9265 - regression_loss: 1.6324 - classification_loss: 0.2941 157/500 [========>.....................] - ETA: 1:55 - loss: 1.9315 - regression_loss: 1.6348 - classification_loss: 0.2967 158/500 [========>.....................] - ETA: 1:55 - loss: 1.9297 - regression_loss: 1.6335 - classification_loss: 0.2961 159/500 [========>.....................] - ETA: 1:54 - loss: 1.9355 - regression_loss: 1.6388 - classification_loss: 0.2967 160/500 [========>.....................] - ETA: 1:54 - loss: 1.9366 - regression_loss: 1.6395 - classification_loss: 0.2971 161/500 [========>.....................] 
- ETA: 1:54 - loss: 1.9422 - regression_loss: 1.6447 - classification_loss: 0.2975 162/500 [========>.....................] - ETA: 1:53 - loss: 1.9421 - regression_loss: 1.6443 - classification_loss: 0.2979 163/500 [========>.....................] - ETA: 1:53 - loss: 1.9492 - regression_loss: 1.6483 - classification_loss: 0.3010 164/500 [========>.....................] - ETA: 1:53 - loss: 1.9465 - regression_loss: 1.6460 - classification_loss: 0.3005 165/500 [========>.....................] - ETA: 1:52 - loss: 1.9512 - regression_loss: 1.6507 - classification_loss: 0.3006 166/500 [========>.....................] - ETA: 1:52 - loss: 1.9517 - regression_loss: 1.6508 - classification_loss: 0.3009 167/500 [=========>....................] - ETA: 1:52 - loss: 1.9546 - regression_loss: 1.6530 - classification_loss: 0.3016 168/500 [=========>....................] - ETA: 1:51 - loss: 1.9487 - regression_loss: 1.6481 - classification_loss: 0.3006 169/500 [=========>....................] - ETA: 1:51 - loss: 1.9533 - regression_loss: 1.6517 - classification_loss: 0.3015 170/500 [=========>....................] - ETA: 1:51 - loss: 1.9517 - regression_loss: 1.6507 - classification_loss: 0.3009 171/500 [=========>....................] - ETA: 1:50 - loss: 1.9525 - regression_loss: 1.6518 - classification_loss: 0.3007 172/500 [=========>....................] - ETA: 1:50 - loss: 1.9512 - regression_loss: 1.6508 - classification_loss: 0.3004 173/500 [=========>....................] - ETA: 1:50 - loss: 1.9611 - regression_loss: 1.6591 - classification_loss: 0.3019 174/500 [=========>....................] - ETA: 1:49 - loss: 1.9621 - regression_loss: 1.6609 - classification_loss: 0.3012 175/500 [=========>....................] - ETA: 1:49 - loss: 1.9633 - regression_loss: 1.6621 - classification_loss: 0.3012 176/500 [=========>....................] - ETA: 1:49 - loss: 1.9660 - regression_loss: 1.6643 - classification_loss: 0.3017 177/500 [=========>....................] 
- ETA: 1:48 - loss: 1.9646 - regression_loss: 1.6632 - classification_loss: 0.3014 178/500 [=========>....................] - ETA: 1:48 - loss: 1.9705 - regression_loss: 1.6682 - classification_loss: 0.3023 179/500 [=========>....................] - ETA: 1:48 - loss: 1.9739 - regression_loss: 1.6709 - classification_loss: 0.3029 180/500 [=========>....................] - ETA: 1:47 - loss: 1.9752 - regression_loss: 1.6722 - classification_loss: 0.3030 181/500 [=========>....................] - ETA: 1:47 - loss: 1.9660 - regression_loss: 1.6629 - classification_loss: 0.3030 182/500 [=========>....................] - ETA: 1:47 - loss: 1.9665 - regression_loss: 1.6637 - classification_loss: 0.3028 183/500 [=========>....................] - ETA: 1:47 - loss: 1.9667 - regression_loss: 1.6642 - classification_loss: 0.3025 184/500 [==========>...................] - ETA: 1:46 - loss: 1.9720 - regression_loss: 1.6686 - classification_loss: 0.3034 185/500 [==========>...................] - ETA: 1:46 - loss: 1.9678 - regression_loss: 1.6652 - classification_loss: 0.3027 186/500 [==========>...................] - ETA: 1:45 - loss: 1.9689 - regression_loss: 1.6665 - classification_loss: 0.3024 187/500 [==========>...................] - ETA: 1:45 - loss: 1.9708 - regression_loss: 1.6683 - classification_loss: 0.3025 188/500 [==========>...................] - ETA: 1:45 - loss: 1.9712 - regression_loss: 1.6691 - classification_loss: 0.3021 189/500 [==========>...................] - ETA: 1:44 - loss: 1.9703 - regression_loss: 1.6684 - classification_loss: 0.3020 190/500 [==========>...................] - ETA: 1:44 - loss: 1.9658 - regression_loss: 1.6646 - classification_loss: 0.3011 191/500 [==========>...................] - ETA: 1:44 - loss: 1.9636 - regression_loss: 1.6629 - classification_loss: 0.3008 192/500 [==========>...................] - ETA: 1:43 - loss: 1.9590 - regression_loss: 1.6594 - classification_loss: 0.2996 193/500 [==========>...................] 
- ETA: 1:43 - loss: 1.9611 - regression_loss: 1.6613 - classification_loss: 0.2998 194/500 [==========>...................] - ETA: 1:43 - loss: 1.9612 - regression_loss: 1.6617 - classification_loss: 0.2995 195/500 [==========>...................] - ETA: 1:43 - loss: 1.9613 - regression_loss: 1.6619 - classification_loss: 0.2994 196/500 [==========>...................] - ETA: 1:42 - loss: 1.9604 - regression_loss: 1.6613 - classification_loss: 0.2991 197/500 [==========>...................] - ETA: 1:42 - loss: 1.9597 - regression_loss: 1.6609 - classification_loss: 0.2988 198/500 [==========>...................] - ETA: 1:42 - loss: 1.9582 - regression_loss: 1.6598 - classification_loss: 0.2985 199/500 [==========>...................] - ETA: 1:41 - loss: 1.9564 - regression_loss: 1.6584 - classification_loss: 0.2980 200/500 [===========>..................] - ETA: 1:41 - loss: 1.9545 - regression_loss: 1.6565 - classification_loss: 0.2981 201/500 [===========>..................] - ETA: 1:41 - loss: 1.9515 - regression_loss: 1.6540 - classification_loss: 0.2975 202/500 [===========>..................] - ETA: 1:40 - loss: 1.9517 - regression_loss: 1.6543 - classification_loss: 0.2974 203/500 [===========>..................] - ETA: 1:40 - loss: 1.9493 - regression_loss: 1.6521 - classification_loss: 0.2972 204/500 [===========>..................] - ETA: 1:39 - loss: 1.9507 - regression_loss: 1.6534 - classification_loss: 0.2973 205/500 [===========>..................] - ETA: 1:39 - loss: 1.9510 - regression_loss: 1.6538 - classification_loss: 0.2973 206/500 [===========>..................] - ETA: 1:39 - loss: 1.9505 - regression_loss: 1.6535 - classification_loss: 0.2970 207/500 [===========>..................] - ETA: 1:38 - loss: 1.9421 - regression_loss: 1.6455 - classification_loss: 0.2966 208/500 [===========>..................] - ETA: 1:38 - loss: 1.9475 - regression_loss: 1.6488 - classification_loss: 0.2988 209/500 [===========>..................] 
- ETA: 1:38 - loss: 1.9464 - regression_loss: 1.6478 - classification_loss: 0.2985 210/500 [===========>..................] - ETA: 1:37 - loss: 1.9500 - regression_loss: 1.6503 - classification_loss: 0.2997 211/500 [===========>..................] - ETA: 1:37 - loss: 1.9491 - regression_loss: 1.6494 - classification_loss: 0.2998 212/500 [===========>..................] - ETA: 1:37 - loss: 1.9548 - regression_loss: 1.6539 - classification_loss: 0.3009 213/500 [===========>..................] - ETA: 1:36 - loss: 1.9530 - regression_loss: 1.6528 - classification_loss: 0.3002 214/500 [===========>..................] - ETA: 1:36 - loss: 1.9538 - regression_loss: 1.6538 - classification_loss: 0.3000 215/500 [===========>..................] - ETA: 1:36 - loss: 1.9526 - regression_loss: 1.6527 - classification_loss: 0.2999 216/500 [===========>..................] - ETA: 1:35 - loss: 1.9527 - regression_loss: 1.6529 - classification_loss: 0.2998 217/500 [============>.................] - ETA: 1:35 - loss: 1.9548 - regression_loss: 1.6540 - classification_loss: 0.3008 218/500 [============>.................] - ETA: 1:35 - loss: 1.9571 - regression_loss: 1.6558 - classification_loss: 0.3013 219/500 [============>.................] - ETA: 1:34 - loss: 1.9575 - regression_loss: 1.6559 - classification_loss: 0.3017 220/500 [============>.................] - ETA: 1:34 - loss: 1.9586 - regression_loss: 1.6572 - classification_loss: 0.3013 221/500 [============>.................] - ETA: 1:34 - loss: 1.9621 - regression_loss: 1.6589 - classification_loss: 0.3032 222/500 [============>.................] - ETA: 1:33 - loss: 1.9607 - regression_loss: 1.6580 - classification_loss: 0.3027 223/500 [============>.................] - ETA: 1:33 - loss: 1.9611 - regression_loss: 1.6589 - classification_loss: 0.3022 224/500 [============>.................] - ETA: 1:33 - loss: 1.9608 - regression_loss: 1.6588 - classification_loss: 0.3020 225/500 [============>.................] 
- ETA: 1:32 - loss: 1.9637 - regression_loss: 1.6619 - classification_loss: 0.3018 226/500 [============>.................] - ETA: 1:32 - loss: 1.9646 - regression_loss: 1.6624 - classification_loss: 0.3022 227/500 [============>.................] - ETA: 1:32 - loss: 1.9647 - regression_loss: 1.6626 - classification_loss: 0.3021 228/500 [============>.................] - ETA: 1:31 - loss: 1.9634 - regression_loss: 1.6617 - classification_loss: 0.3017 229/500 [============>.................] - ETA: 1:31 - loss: 1.9640 - regression_loss: 1.6623 - classification_loss: 0.3018 230/500 [============>.................] - ETA: 1:31 - loss: 1.9642 - regression_loss: 1.6625 - classification_loss: 0.3017 231/500 [============>.................] - ETA: 1:30 - loss: 1.9585 - regression_loss: 1.6576 - classification_loss: 0.3009 232/500 [============>.................] - ETA: 1:30 - loss: 1.9581 - regression_loss: 1.6574 - classification_loss: 0.3007 233/500 [============>.................] - ETA: 1:30 - loss: 1.9604 - regression_loss: 1.6593 - classification_loss: 0.3011 234/500 [=============>................] - ETA: 1:29 - loss: 1.9571 - regression_loss: 1.6566 - classification_loss: 0.3005 235/500 [=============>................] - ETA: 1:29 - loss: 1.9570 - regression_loss: 1.6562 - classification_loss: 0.3008 236/500 [=============>................] - ETA: 1:29 - loss: 1.9566 - regression_loss: 1.6559 - classification_loss: 0.3007 237/500 [=============>................] - ETA: 1:28 - loss: 1.9556 - regression_loss: 1.6551 - classification_loss: 0.3006 238/500 [=============>................] - ETA: 1:28 - loss: 1.9614 - regression_loss: 1.6590 - classification_loss: 0.3023 239/500 [=============>................] - ETA: 1:28 - loss: 1.9614 - regression_loss: 1.6591 - classification_loss: 0.3023 240/500 [=============>................] - ETA: 1:27 - loss: 1.9640 - regression_loss: 1.6610 - classification_loss: 0.3030 241/500 [=============>................] 
- ETA: 1:27 - loss: 1.9620 - regression_loss: 1.6597 - classification_loss: 0.3023 242/500 [=============>................] - ETA: 1:27 - loss: 1.9614 - regression_loss: 1.6594 - classification_loss: 0.3020 243/500 [=============>................] - ETA: 1:26 - loss: 1.9631 - regression_loss: 1.6609 - classification_loss: 0.3021 244/500 [=============>................] - ETA: 1:26 - loss: 1.9627 - regression_loss: 1.6607 - classification_loss: 0.3020 245/500 [=============>................] - ETA: 1:26 - loss: 1.9613 - regression_loss: 1.6597 - classification_loss: 0.3016 246/500 [=============>................] - ETA: 1:25 - loss: 1.9632 - regression_loss: 1.6611 - classification_loss: 0.3022 247/500 [=============>................] - ETA: 1:25 - loss: 1.9585 - regression_loss: 1.6571 - classification_loss: 0.3014 248/500 [=============>................] - ETA: 1:25 - loss: 1.9592 - regression_loss: 1.6577 - classification_loss: 0.3015 249/500 [=============>................] - ETA: 1:24 - loss: 1.9593 - regression_loss: 1.6578 - classification_loss: 0.3015 250/500 [==============>...............] - ETA: 1:24 - loss: 1.9563 - regression_loss: 1.6550 - classification_loss: 0.3013 251/500 [==============>...............] - ETA: 1:24 - loss: 1.9563 - regression_loss: 1.6554 - classification_loss: 0.3009 252/500 [==============>...............] - ETA: 1:23 - loss: 1.9646 - regression_loss: 1.6629 - classification_loss: 0.3017 253/500 [==============>...............] - ETA: 1:23 - loss: 1.9666 - regression_loss: 1.6643 - classification_loss: 0.3024 254/500 [==============>...............] - ETA: 1:23 - loss: 1.9648 - regression_loss: 1.6628 - classification_loss: 0.3020 255/500 [==============>...............] - ETA: 1:22 - loss: 1.9641 - regression_loss: 1.6622 - classification_loss: 0.3019 256/500 [==============>...............] - ETA: 1:22 - loss: 1.9639 - regression_loss: 1.6623 - classification_loss: 0.3015 257/500 [==============>...............] 
- ETA: 1:22 - loss: 1.9676 - regression_loss: 1.6648 - classification_loss: 0.3028 258/500 [==============>...............] - ETA: 1:21 - loss: 1.9658 - regression_loss: 1.6632 - classification_loss: 0.3026 259/500 [==============>...............] - ETA: 1:21 - loss: 1.9643 - regression_loss: 1.6616 - classification_loss: 0.3027 260/500 [==============>...............] - ETA: 1:21 - loss: 1.9635 - regression_loss: 1.6609 - classification_loss: 0.3027 261/500 [==============>...............] - ETA: 1:20 - loss: 1.9630 - regression_loss: 1.6607 - classification_loss: 0.3023 262/500 [==============>...............] - ETA: 1:20 - loss: 1.9626 - regression_loss: 1.6604 - classification_loss: 0.3023 263/500 [==============>...............] - ETA: 1:20 - loss: 1.9613 - regression_loss: 1.6595 - classification_loss: 0.3017 264/500 [==============>...............] - ETA: 1:19 - loss: 1.9608 - regression_loss: 1.6593 - classification_loss: 0.3015 265/500 [==============>...............] - ETA: 1:19 - loss: 1.9617 - regression_loss: 1.6602 - classification_loss: 0.3016 266/500 [==============>...............] - ETA: 1:19 - loss: 1.9599 - regression_loss: 1.6590 - classification_loss: 0.3009 267/500 [===============>..............] - ETA: 1:18 - loss: 1.9613 - regression_loss: 1.6601 - classification_loss: 0.3013 268/500 [===============>..............] - ETA: 1:18 - loss: 1.9619 - regression_loss: 1.6608 - classification_loss: 0.3011 269/500 [===============>..............] - ETA: 1:18 - loss: 1.9590 - regression_loss: 1.6583 - classification_loss: 0.3008 270/500 [===============>..............] - ETA: 1:17 - loss: 1.9565 - regression_loss: 1.6557 - classification_loss: 0.3008 271/500 [===============>..............] - ETA: 1:17 - loss: 1.9553 - regression_loss: 1.6548 - classification_loss: 0.3004 272/500 [===============>..............] - ETA: 1:17 - loss: 1.9551 - regression_loss: 1.6547 - classification_loss: 0.3004 273/500 [===============>..............] 
- ETA: 1:16 - loss: 1.9533 - regression_loss: 1.6531 - classification_loss: 0.3001 274/500 [===============>..............] - ETA: 1:16 - loss: 1.9506 - regression_loss: 1.6511 - classification_loss: 0.2995 275/500 [===============>..............] - ETA: 1:15 - loss: 1.9519 - regression_loss: 1.6522 - classification_loss: 0.2997 276/500 [===============>..............] - ETA: 1:15 - loss: 1.9499 - regression_loss: 1.6506 - classification_loss: 0.2993 277/500 [===============>..............] - ETA: 1:15 - loss: 1.9499 - regression_loss: 1.6506 - classification_loss: 0.2993 278/500 [===============>..............] - ETA: 1:14 - loss: 1.9483 - regression_loss: 1.6494 - classification_loss: 0.2989 279/500 [===============>..............] - ETA: 1:14 - loss: 1.9479 - regression_loss: 1.6491 - classification_loss: 0.2987 280/500 [===============>..............] - ETA: 1:14 - loss: 1.9479 - regression_loss: 1.6496 - classification_loss: 0.2983 281/500 [===============>..............] - ETA: 1:13 - loss: 1.9443 - regression_loss: 1.6467 - classification_loss: 0.2976 282/500 [===============>..............] - ETA: 1:13 - loss: 1.9448 - regression_loss: 1.6472 - classification_loss: 0.2976 283/500 [===============>..............] - ETA: 1:13 - loss: 1.9448 - regression_loss: 1.6470 - classification_loss: 0.2977 284/500 [================>.............] - ETA: 1:12 - loss: 1.9462 - regression_loss: 1.6485 - classification_loss: 0.2977 285/500 [================>.............] - ETA: 1:12 - loss: 1.9450 - regression_loss: 1.6475 - classification_loss: 0.2976 286/500 [================>.............] - ETA: 1:12 - loss: 1.9438 - regression_loss: 1.6466 - classification_loss: 0.2971 287/500 [================>.............] - ETA: 1:11 - loss: 1.9435 - regression_loss: 1.6465 - classification_loss: 0.2970 288/500 [================>.............] - ETA: 1:11 - loss: 1.9475 - regression_loss: 1.6504 - classification_loss: 0.2971 289/500 [================>.............] 
- ETA: 1:11 - loss: 1.9500 - regression_loss: 1.6522 - classification_loss: 0.2978 290/500 [================>.............] - ETA: 1:10 - loss: 1.9511 - regression_loss: 1.6535 - classification_loss: 0.2976 291/500 [================>.............] - ETA: 1:10 - loss: 1.9501 - regression_loss: 1.6529 - classification_loss: 0.2972 292/500 [================>.............] - ETA: 1:10 - loss: 1.9504 - regression_loss: 1.6532 - classification_loss: 0.2972 293/500 [================>.............] - ETA: 1:09 - loss: 1.9491 - regression_loss: 1.6523 - classification_loss: 0.2967 294/500 [================>.............] - ETA: 1:09 - loss: 1.9507 - regression_loss: 1.6538 - classification_loss: 0.2969 295/500 [================>.............] - ETA: 1:09 - loss: 1.9494 - regression_loss: 1.6529 - classification_loss: 0.2964 296/500 [================>.............] - ETA: 1:08 - loss: 1.9505 - regression_loss: 1.6538 - classification_loss: 0.2968 297/500 [================>.............] - ETA: 1:08 - loss: 1.9505 - regression_loss: 1.6538 - classification_loss: 0.2967 298/500 [================>.............] - ETA: 1:08 - loss: 1.9474 - regression_loss: 1.6513 - classification_loss: 0.2961 299/500 [================>.............] - ETA: 1:07 - loss: 1.9461 - regression_loss: 1.6506 - classification_loss: 0.2955 300/500 [=================>............] - ETA: 1:07 - loss: 1.9425 - regression_loss: 1.6477 - classification_loss: 0.2948 301/500 [=================>............] - ETA: 1:07 - loss: 1.9392 - regression_loss: 1.6451 - classification_loss: 0.2942 302/500 [=================>............] - ETA: 1:06 - loss: 1.9346 - regression_loss: 1.6412 - classification_loss: 0.2934 303/500 [=================>............] - ETA: 1:06 - loss: 1.9335 - regression_loss: 1.6402 - classification_loss: 0.2933 304/500 [=================>............] - ETA: 1:06 - loss: 1.9332 - regression_loss: 1.6400 - classification_loss: 0.2932 305/500 [=================>............] 
- ETA: 1:05 - loss: 1.9345 - regression_loss: 1.6411 - classification_loss: 0.2934 306/500 [=================>............] - ETA: 1:05 - loss: 1.9340 - regression_loss: 1.6407 - classification_loss: 0.2933 307/500 [=================>............] - ETA: 1:05 - loss: 1.9321 - regression_loss: 1.6390 - classification_loss: 0.2932 308/500 [=================>............] - ETA: 1:04 - loss: 1.9300 - regression_loss: 1.6372 - classification_loss: 0.2928 309/500 [=================>............] - ETA: 1:04 - loss: 1.9283 - regression_loss: 1.6359 - classification_loss: 0.2924 310/500 [=================>............] - ETA: 1:04 - loss: 1.9282 - regression_loss: 1.6359 - classification_loss: 0.2923 311/500 [=================>............] - ETA: 1:03 - loss: 1.9283 - regression_loss: 1.6360 - classification_loss: 0.2923 312/500 [=================>............] - ETA: 1:03 - loss: 1.9288 - regression_loss: 1.6366 - classification_loss: 0.2922 313/500 [=================>............] - ETA: 1:03 - loss: 1.9275 - regression_loss: 1.6357 - classification_loss: 0.2918 314/500 [=================>............] - ETA: 1:02 - loss: 1.9247 - regression_loss: 1.6334 - classification_loss: 0.2914 315/500 [=================>............] - ETA: 1:02 - loss: 1.9274 - regression_loss: 1.6353 - classification_loss: 0.2920 316/500 [=================>............] - ETA: 1:02 - loss: 1.9331 - regression_loss: 1.6406 - classification_loss: 0.2925 317/500 [==================>...........] - ETA: 1:01 - loss: 1.9332 - regression_loss: 1.6408 - classification_loss: 0.2924 318/500 [==================>...........] - ETA: 1:01 - loss: 1.9314 - regression_loss: 1.6393 - classification_loss: 0.2921 319/500 [==================>...........] - ETA: 1:01 - loss: 1.9318 - regression_loss: 1.6397 - classification_loss: 0.2921 320/500 [==================>...........] - ETA: 1:00 - loss: 1.9379 - regression_loss: 1.6445 - classification_loss: 0.2934 321/500 [==================>...........] 
- ETA: 1:00 - loss: 1.9368 - regression_loss: 1.6437 - classification_loss: 0.2931 322/500 [==================>...........] - ETA: 1:00 - loss: 1.9371 - regression_loss: 1.6441 - classification_loss: 0.2930 323/500 [==================>...........] - ETA: 59s - loss: 1.9364 - regression_loss: 1.6438 - classification_loss: 0.2926  324/500 [==================>...........] - ETA: 59s - loss: 1.9342 - regression_loss: 1.6419 - classification_loss: 0.2923 325/500 [==================>...........] - ETA: 59s - loss: 1.9336 - regression_loss: 1.6415 - classification_loss: 0.2921 326/500 [==================>...........] - ETA: 58s - loss: 1.9345 - regression_loss: 1.6421 - classification_loss: 0.2924 327/500 [==================>...........] - ETA: 58s - loss: 1.9331 - regression_loss: 1.6409 - classification_loss: 0.2922 328/500 [==================>...........] - ETA: 58s - loss: 1.9319 - regression_loss: 1.6398 - classification_loss: 0.2921 329/500 [==================>...........] - ETA: 57s - loss: 1.9309 - regression_loss: 1.6391 - classification_loss: 0.2917 330/500 [==================>...........] - ETA: 57s - loss: 1.9334 - regression_loss: 1.6409 - classification_loss: 0.2926 331/500 [==================>...........] - ETA: 57s - loss: 1.9329 - regression_loss: 1.6407 - classification_loss: 0.2923 332/500 [==================>...........] - ETA: 56s - loss: 1.9301 - regression_loss: 1.6382 - classification_loss: 0.2919 333/500 [==================>...........] - ETA: 56s - loss: 1.9323 - regression_loss: 1.6397 - classification_loss: 0.2926 334/500 [===================>..........] - ETA: 56s - loss: 1.9296 - regression_loss: 1.6374 - classification_loss: 0.2922 335/500 [===================>..........] - ETA: 55s - loss: 1.9317 - regression_loss: 1.6390 - classification_loss: 0.2927 336/500 [===================>..........] - ETA: 55s - loss: 1.9323 - regression_loss: 1.6396 - classification_loss: 0.2926 337/500 [===================>..........] 
- ETA: 54s - loss: 1.9308 - regression_loss: 1.6384 - classification_loss: 0.2923
[steps 338-496 of epoch 4: per-step progress-bar updates elided; loss fluctuated between ~1.92 and ~1.95 (regression_loss ~1.63-1.66, classification_loss ~0.29)]
497/500 [============================>.]
- ETA: 1s - loss: 1.9470 - regression_loss: 1.6529 - classification_loss: 0.2941
[steps 498-499 of epoch 4: per-step progress-bar updates elided]
500/500 [==============================] - 169s 338ms/step - loss: 1.9552 - regression_loss: 1.6598 - classification_loss: 0.2953
326 instances of class plum with average precision: 0.6237
mAP: 0.6237
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 5/150
[steps 1-11 of epoch 5: per-step progress-bar updates elided; loss ~1.47-1.84]
12/500 [..............................]
- ETA: 2:46 - loss: 1.7873 - regression_loss: 1.5566 - classification_loss: 0.2306
[steps 13-171 of epoch 5: per-step progress-bar updates elided; loss fluctuated between ~1.75 and ~1.93 (regression_loss ~1.53-1.65, classification_loss ~0.22-0.29)]
172/500 [=========>....................]
- ETA: 1:51 - loss: 1.8978 - regression_loss: 1.6123 - classification_loss: 0.2855 173/500 [=========>....................] - ETA: 1:51 - loss: 1.8963 - regression_loss: 1.6110 - classification_loss: 0.2853 174/500 [=========>....................] - ETA: 1:51 - loss: 1.8957 - regression_loss: 1.6106 - classification_loss: 0.2851 175/500 [=========>....................] - ETA: 1:50 - loss: 1.8945 - regression_loss: 1.6096 - classification_loss: 0.2849 176/500 [=========>....................] - ETA: 1:50 - loss: 1.8934 - regression_loss: 1.6090 - classification_loss: 0.2844 177/500 [=========>....................] - ETA: 1:50 - loss: 1.8866 - regression_loss: 1.6033 - classification_loss: 0.2833 178/500 [=========>....................] - ETA: 1:49 - loss: 1.8880 - regression_loss: 1.6044 - classification_loss: 0.2835 179/500 [=========>....................] - ETA: 1:49 - loss: 1.8892 - regression_loss: 1.6052 - classification_loss: 0.2840 180/500 [=========>....................] - ETA: 1:49 - loss: 1.8903 - regression_loss: 1.6063 - classification_loss: 0.2840 181/500 [=========>....................] - ETA: 1:48 - loss: 1.8844 - regression_loss: 1.6013 - classification_loss: 0.2831 182/500 [=========>....................] - ETA: 1:48 - loss: 1.8819 - regression_loss: 1.5992 - classification_loss: 0.2828 183/500 [=========>....................] - ETA: 1:48 - loss: 1.8839 - regression_loss: 1.6004 - classification_loss: 0.2836 184/500 [==========>...................] - ETA: 1:47 - loss: 1.8849 - regression_loss: 1.6012 - classification_loss: 0.2838 185/500 [==========>...................] - ETA: 1:47 - loss: 1.8827 - regression_loss: 1.5995 - classification_loss: 0.2832 186/500 [==========>...................] - ETA: 1:47 - loss: 1.8836 - regression_loss: 1.6000 - classification_loss: 0.2836 187/500 [==========>...................] - ETA: 1:46 - loss: 1.8783 - regression_loss: 1.5954 - classification_loss: 0.2829 188/500 [==========>...................] 
- ETA: 1:46 - loss: 1.8764 - regression_loss: 1.5943 - classification_loss: 0.2821 189/500 [==========>...................] - ETA: 1:45 - loss: 1.8767 - regression_loss: 1.5945 - classification_loss: 0.2822 190/500 [==========>...................] - ETA: 1:45 - loss: 1.8774 - regression_loss: 1.5948 - classification_loss: 0.2826 191/500 [==========>...................] - ETA: 1:45 - loss: 1.8739 - regression_loss: 1.5922 - classification_loss: 0.2817 192/500 [==========>...................] - ETA: 1:44 - loss: 1.8779 - regression_loss: 1.5952 - classification_loss: 0.2827 193/500 [==========>...................] - ETA: 1:44 - loss: 1.8774 - regression_loss: 1.5946 - classification_loss: 0.2828 194/500 [==========>...................] - ETA: 1:44 - loss: 1.8753 - regression_loss: 1.5927 - classification_loss: 0.2826 195/500 [==========>...................] - ETA: 1:43 - loss: 1.8744 - regression_loss: 1.5922 - classification_loss: 0.2822 196/500 [==========>...................] - ETA: 1:43 - loss: 1.8726 - regression_loss: 1.5912 - classification_loss: 0.2814 197/500 [==========>...................] - ETA: 1:43 - loss: 1.8724 - regression_loss: 1.5910 - classification_loss: 0.2813 198/500 [==========>...................] - ETA: 1:42 - loss: 1.8722 - regression_loss: 1.5914 - classification_loss: 0.2808 199/500 [==========>...................] - ETA: 1:42 - loss: 1.8727 - regression_loss: 1.5923 - classification_loss: 0.2804 200/500 [===========>..................] - ETA: 1:42 - loss: 1.8741 - regression_loss: 1.5932 - classification_loss: 0.2809 201/500 [===========>..................] - ETA: 1:41 - loss: 1.8719 - regression_loss: 1.5915 - classification_loss: 0.2804 202/500 [===========>..................] - ETA: 1:41 - loss: 1.8678 - regression_loss: 1.5882 - classification_loss: 0.2796 203/500 [===========>..................] - ETA: 1:41 - loss: 1.8741 - regression_loss: 1.5935 - classification_loss: 0.2806 204/500 [===========>..................] 
- ETA: 1:40 - loss: 1.8738 - regression_loss: 1.5938 - classification_loss: 0.2800 205/500 [===========>..................] - ETA: 1:40 - loss: 1.8728 - regression_loss: 1.5933 - classification_loss: 0.2795 206/500 [===========>..................] - ETA: 1:40 - loss: 1.8710 - regression_loss: 1.5920 - classification_loss: 0.2790 207/500 [===========>..................] - ETA: 1:39 - loss: 1.8711 - regression_loss: 1.5918 - classification_loss: 0.2793 208/500 [===========>..................] - ETA: 1:39 - loss: 1.8701 - regression_loss: 1.5912 - classification_loss: 0.2788 209/500 [===========>..................] - ETA: 1:39 - loss: 1.8690 - regression_loss: 1.5905 - classification_loss: 0.2786 210/500 [===========>..................] - ETA: 1:38 - loss: 1.8689 - regression_loss: 1.5907 - classification_loss: 0.2782 211/500 [===========>..................] - ETA: 1:38 - loss: 1.8677 - regression_loss: 1.5896 - classification_loss: 0.2781 212/500 [===========>..................] - ETA: 1:38 - loss: 1.8688 - regression_loss: 1.5909 - classification_loss: 0.2779 213/500 [===========>..................] - ETA: 1:37 - loss: 1.8678 - regression_loss: 1.5902 - classification_loss: 0.2776 214/500 [===========>..................] - ETA: 1:37 - loss: 1.8637 - regression_loss: 1.5860 - classification_loss: 0.2777 215/500 [===========>..................] - ETA: 1:36 - loss: 1.8631 - regression_loss: 1.5855 - classification_loss: 0.2776 216/500 [===========>..................] - ETA: 1:36 - loss: 1.8653 - regression_loss: 1.5878 - classification_loss: 0.2775 217/500 [============>.................] - ETA: 1:36 - loss: 1.8640 - regression_loss: 1.5867 - classification_loss: 0.2774 218/500 [============>.................] - ETA: 1:35 - loss: 1.8670 - regression_loss: 1.5891 - classification_loss: 0.2778 219/500 [============>.................] - ETA: 1:35 - loss: 1.8675 - regression_loss: 1.5898 - classification_loss: 0.2777 220/500 [============>.................] 
- ETA: 1:35 - loss: 1.8667 - regression_loss: 1.5895 - classification_loss: 0.2772 221/500 [============>.................] - ETA: 1:34 - loss: 1.8654 - regression_loss: 1.5884 - classification_loss: 0.2770 222/500 [============>.................] - ETA: 1:34 - loss: 1.8692 - regression_loss: 1.5913 - classification_loss: 0.2779 223/500 [============>.................] - ETA: 1:34 - loss: 1.8706 - regression_loss: 1.5921 - classification_loss: 0.2785 224/500 [============>.................] - ETA: 1:33 - loss: 1.8719 - regression_loss: 1.5933 - classification_loss: 0.2786 225/500 [============>.................] - ETA: 1:33 - loss: 1.8752 - regression_loss: 1.5955 - classification_loss: 0.2797 226/500 [============>.................] - ETA: 1:33 - loss: 1.8730 - regression_loss: 1.5938 - classification_loss: 0.2792 227/500 [============>.................] - ETA: 1:32 - loss: 1.8691 - regression_loss: 1.5900 - classification_loss: 0.2792 228/500 [============>.................] - ETA: 1:32 - loss: 1.8685 - regression_loss: 1.5896 - classification_loss: 0.2790 229/500 [============>.................] - ETA: 1:32 - loss: 1.8672 - regression_loss: 1.5886 - classification_loss: 0.2785 230/500 [============>.................] - ETA: 1:31 - loss: 1.8691 - regression_loss: 1.5897 - classification_loss: 0.2794 231/500 [============>.................] - ETA: 1:31 - loss: 1.8671 - regression_loss: 1.5881 - classification_loss: 0.2790 232/500 [============>.................] - ETA: 1:31 - loss: 1.8673 - regression_loss: 1.5885 - classification_loss: 0.2788 233/500 [============>.................] - ETA: 1:30 - loss: 1.8678 - regression_loss: 1.5891 - classification_loss: 0.2787 234/500 [=============>................] - ETA: 1:30 - loss: 1.8697 - regression_loss: 1.5900 - classification_loss: 0.2797 235/500 [=============>................] - ETA: 1:30 - loss: 1.8682 - regression_loss: 1.5888 - classification_loss: 0.2794 236/500 [=============>................] 
- ETA: 1:29 - loss: 1.8694 - regression_loss: 1.5897 - classification_loss: 0.2796 237/500 [=============>................] - ETA: 1:29 - loss: 1.8671 - regression_loss: 1.5879 - classification_loss: 0.2793 238/500 [=============>................] - ETA: 1:29 - loss: 1.8677 - regression_loss: 1.5884 - classification_loss: 0.2794 239/500 [=============>................] - ETA: 1:28 - loss: 1.8700 - regression_loss: 1.5909 - classification_loss: 0.2791 240/500 [=============>................] - ETA: 1:28 - loss: 1.8718 - regression_loss: 1.5924 - classification_loss: 0.2794 241/500 [=============>................] - ETA: 1:28 - loss: 1.8714 - regression_loss: 1.5921 - classification_loss: 0.2793 242/500 [=============>................] - ETA: 1:27 - loss: 1.8712 - regression_loss: 1.5923 - classification_loss: 0.2790 243/500 [=============>................] - ETA: 1:27 - loss: 1.8751 - regression_loss: 1.5949 - classification_loss: 0.2803 244/500 [=============>................] - ETA: 1:27 - loss: 1.8738 - regression_loss: 1.5939 - classification_loss: 0.2799 245/500 [=============>................] - ETA: 1:26 - loss: 1.8719 - regression_loss: 1.5925 - classification_loss: 0.2794 246/500 [=============>................] - ETA: 1:26 - loss: 1.8715 - regression_loss: 1.5922 - classification_loss: 0.2794 247/500 [=============>................] - ETA: 1:26 - loss: 1.8709 - regression_loss: 1.5916 - classification_loss: 0.2793 248/500 [=============>................] - ETA: 1:25 - loss: 1.8730 - regression_loss: 1.5937 - classification_loss: 0.2793 249/500 [=============>................] - ETA: 1:25 - loss: 1.8695 - regression_loss: 1.5908 - classification_loss: 0.2787 250/500 [==============>...............] - ETA: 1:25 - loss: 1.8681 - regression_loss: 1.5898 - classification_loss: 0.2784 251/500 [==============>...............] - ETA: 1:24 - loss: 1.8681 - regression_loss: 1.5891 - classification_loss: 0.2790 252/500 [==============>...............] 
- ETA: 1:24 - loss: 1.8675 - regression_loss: 1.5888 - classification_loss: 0.2787 253/500 [==============>...............] - ETA: 1:24 - loss: 1.8669 - regression_loss: 1.5887 - classification_loss: 0.2782 254/500 [==============>...............] - ETA: 1:23 - loss: 1.8713 - regression_loss: 1.5922 - classification_loss: 0.2791 255/500 [==============>...............] - ETA: 1:23 - loss: 1.8689 - regression_loss: 1.5902 - classification_loss: 0.2787 256/500 [==============>...............] - ETA: 1:23 - loss: 1.8698 - regression_loss: 1.5911 - classification_loss: 0.2787 257/500 [==============>...............] - ETA: 1:22 - loss: 1.8692 - regression_loss: 1.5905 - classification_loss: 0.2787 258/500 [==============>...............] - ETA: 1:22 - loss: 1.8708 - regression_loss: 1.5915 - classification_loss: 0.2793 259/500 [==============>...............] - ETA: 1:22 - loss: 1.8681 - regression_loss: 1.5893 - classification_loss: 0.2788 260/500 [==============>...............] - ETA: 1:21 - loss: 1.8667 - regression_loss: 1.5876 - classification_loss: 0.2791 261/500 [==============>...............] - ETA: 1:21 - loss: 1.8628 - regression_loss: 1.5844 - classification_loss: 0.2784 262/500 [==============>...............] - ETA: 1:20 - loss: 1.8624 - regression_loss: 1.5843 - classification_loss: 0.2781 263/500 [==============>...............] - ETA: 1:20 - loss: 1.8613 - regression_loss: 1.5835 - classification_loss: 0.2778 264/500 [==============>...............] - ETA: 1:20 - loss: 1.8574 - regression_loss: 1.5803 - classification_loss: 0.2771 265/500 [==============>...............] - ETA: 1:19 - loss: 1.8602 - regression_loss: 1.5822 - classification_loss: 0.2780 266/500 [==============>...............] - ETA: 1:19 - loss: 1.8603 - regression_loss: 1.5820 - classification_loss: 0.2783 267/500 [===============>..............] - ETA: 1:19 - loss: 1.8620 - regression_loss: 1.5835 - classification_loss: 0.2785 268/500 [===============>..............] 
- ETA: 1:18 - loss: 1.8638 - regression_loss: 1.5852 - classification_loss: 0.2786 269/500 [===============>..............] - ETA: 1:18 - loss: 1.8607 - regression_loss: 1.5827 - classification_loss: 0.2780 270/500 [===============>..............] - ETA: 1:18 - loss: 1.8597 - regression_loss: 1.5821 - classification_loss: 0.2776 271/500 [===============>..............] - ETA: 1:17 - loss: 1.8585 - regression_loss: 1.5812 - classification_loss: 0.2773 272/500 [===============>..............] - ETA: 1:17 - loss: 1.8579 - regression_loss: 1.5805 - classification_loss: 0.2773 273/500 [===============>..............] - ETA: 1:17 - loss: 1.8563 - regression_loss: 1.5794 - classification_loss: 0.2770 274/500 [===============>..............] - ETA: 1:16 - loss: 1.8554 - regression_loss: 1.5785 - classification_loss: 0.2768 275/500 [===============>..............] - ETA: 1:16 - loss: 1.8525 - regression_loss: 1.5761 - classification_loss: 0.2764 276/500 [===============>..............] - ETA: 1:16 - loss: 1.8502 - regression_loss: 1.5743 - classification_loss: 0.2759 277/500 [===============>..............] - ETA: 1:15 - loss: 1.8515 - regression_loss: 1.5755 - classification_loss: 0.2761 278/500 [===============>..............] - ETA: 1:15 - loss: 1.8539 - regression_loss: 1.5776 - classification_loss: 0.2763 279/500 [===============>..............] - ETA: 1:15 - loss: 1.8512 - regression_loss: 1.5754 - classification_loss: 0.2758 280/500 [===============>..............] - ETA: 1:14 - loss: 1.8535 - regression_loss: 1.5772 - classification_loss: 0.2763 281/500 [===============>..............] - ETA: 1:14 - loss: 1.8525 - regression_loss: 1.5764 - classification_loss: 0.2761 282/500 [===============>..............] - ETA: 1:14 - loss: 1.8525 - regression_loss: 1.5765 - classification_loss: 0.2760 283/500 [===============>..............] - ETA: 1:13 - loss: 1.8533 - regression_loss: 1.5772 - classification_loss: 0.2760 284/500 [================>.............] 
- ETA: 1:13 - loss: 1.8536 - regression_loss: 1.5776 - classification_loss: 0.2760 285/500 [================>.............] - ETA: 1:13 - loss: 1.8530 - regression_loss: 1.5772 - classification_loss: 0.2758 286/500 [================>.............] - ETA: 1:12 - loss: 1.8528 - regression_loss: 1.5770 - classification_loss: 0.2758 287/500 [================>.............] - ETA: 1:12 - loss: 1.8523 - regression_loss: 1.5767 - classification_loss: 0.2756 288/500 [================>.............] - ETA: 1:12 - loss: 1.8505 - regression_loss: 1.5754 - classification_loss: 0.2752 289/500 [================>.............] - ETA: 1:11 - loss: 1.8505 - regression_loss: 1.5753 - classification_loss: 0.2752 290/500 [================>.............] - ETA: 1:11 - loss: 1.8515 - regression_loss: 1.5760 - classification_loss: 0.2755 291/500 [================>.............] - ETA: 1:10 - loss: 1.8506 - regression_loss: 1.5753 - classification_loss: 0.2753 292/500 [================>.............] - ETA: 1:10 - loss: 1.8507 - regression_loss: 1.5753 - classification_loss: 0.2754 293/500 [================>.............] - ETA: 1:10 - loss: 1.8483 - regression_loss: 1.5733 - classification_loss: 0.2750 294/500 [================>.............] - ETA: 1:09 - loss: 1.8478 - regression_loss: 1.5730 - classification_loss: 0.2749 295/500 [================>.............] - ETA: 1:09 - loss: 1.8458 - regression_loss: 1.5714 - classification_loss: 0.2744 296/500 [================>.............] - ETA: 1:09 - loss: 1.8446 - regression_loss: 1.5700 - classification_loss: 0.2746 297/500 [================>.............] - ETA: 1:08 - loss: 1.8435 - regression_loss: 1.5692 - classification_loss: 0.2743 298/500 [================>.............] - ETA: 1:08 - loss: 1.8398 - regression_loss: 1.5660 - classification_loss: 0.2738 299/500 [================>.............] - ETA: 1:08 - loss: 1.8390 - regression_loss: 1.5655 - classification_loss: 0.2735 300/500 [=================>............] 
- ETA: 1:07 - loss: 1.8396 - regression_loss: 1.5662 - classification_loss: 0.2735 301/500 [=================>............] - ETA: 1:07 - loss: 1.8396 - regression_loss: 1.5662 - classification_loss: 0.2734 302/500 [=================>............] - ETA: 1:07 - loss: 1.8358 - regression_loss: 1.5630 - classification_loss: 0.2728 303/500 [=================>............] - ETA: 1:06 - loss: 1.8335 - regression_loss: 1.5610 - classification_loss: 0.2725 304/500 [=================>............] - ETA: 1:06 - loss: 1.8334 - regression_loss: 1.5611 - classification_loss: 0.2723 305/500 [=================>............] - ETA: 1:06 - loss: 1.8313 - regression_loss: 1.5594 - classification_loss: 0.2719 306/500 [=================>............] - ETA: 1:05 - loss: 1.8306 - regression_loss: 1.5587 - classification_loss: 0.2719 307/500 [=================>............] - ETA: 1:05 - loss: 1.8311 - regression_loss: 1.5593 - classification_loss: 0.2718 308/500 [=================>............] - ETA: 1:05 - loss: 1.8317 - regression_loss: 1.5596 - classification_loss: 0.2721 309/500 [=================>............] - ETA: 1:04 - loss: 1.8334 - regression_loss: 1.5616 - classification_loss: 0.2718 310/500 [=================>............] - ETA: 1:04 - loss: 1.8290 - regression_loss: 1.5579 - classification_loss: 0.2712 311/500 [=================>............] - ETA: 1:04 - loss: 1.8277 - regression_loss: 1.5567 - classification_loss: 0.2711 312/500 [=================>............] - ETA: 1:03 - loss: 1.8277 - regression_loss: 1.5569 - classification_loss: 0.2707 313/500 [=================>............] - ETA: 1:03 - loss: 1.8250 - regression_loss: 1.5548 - classification_loss: 0.2702 314/500 [=================>............] - ETA: 1:03 - loss: 1.8245 - regression_loss: 1.5544 - classification_loss: 0.2701 315/500 [=================>............] - ETA: 1:02 - loss: 1.8291 - regression_loss: 1.5577 - classification_loss: 0.2714 316/500 [=================>............] 
- ETA: 1:02 - loss: 1.8276 - regression_loss: 1.5566 - classification_loss: 0.2710 317/500 [==================>...........] - ETA: 1:02 - loss: 1.8290 - regression_loss: 1.5579 - classification_loss: 0.2710 318/500 [==================>...........] - ETA: 1:01 - loss: 1.8288 - regression_loss: 1.5579 - classification_loss: 0.2709 319/500 [==================>...........] - ETA: 1:01 - loss: 1.8331 - regression_loss: 1.5618 - classification_loss: 0.2714 320/500 [==================>...........] - ETA: 1:01 - loss: 1.8333 - regression_loss: 1.5621 - classification_loss: 0.2712 321/500 [==================>...........] - ETA: 1:00 - loss: 1.8357 - regression_loss: 1.5640 - classification_loss: 0.2717 322/500 [==================>...........] - ETA: 1:00 - loss: 1.8367 - regression_loss: 1.5647 - classification_loss: 0.2719 323/500 [==================>...........] - ETA: 1:00 - loss: 1.8366 - regression_loss: 1.5646 - classification_loss: 0.2719 324/500 [==================>...........] - ETA: 59s - loss: 1.8335 - regression_loss: 1.5621 - classification_loss: 0.2714  325/500 [==================>...........] - ETA: 59s - loss: 1.8348 - regression_loss: 1.5626 - classification_loss: 0.2721 326/500 [==================>...........] - ETA: 59s - loss: 1.8353 - regression_loss: 1.5631 - classification_loss: 0.2722 327/500 [==================>...........] - ETA: 58s - loss: 1.8368 - regression_loss: 1.5650 - classification_loss: 0.2718 328/500 [==================>...........] - ETA: 58s - loss: 1.8354 - regression_loss: 1.5640 - classification_loss: 0.2714 329/500 [==================>...........] - ETA: 58s - loss: 1.8339 - regression_loss: 1.5627 - classification_loss: 0.2712 330/500 [==================>...........] - ETA: 57s - loss: 1.8349 - regression_loss: 1.5638 - classification_loss: 0.2711 331/500 [==================>...........] - ETA: 57s - loss: 1.8346 - regression_loss: 1.5634 - classification_loss: 0.2712 332/500 [==================>...........] 
- ETA: 57s - loss: 1.8370 - regression_loss: 1.5654 - classification_loss: 0.2716 333/500 [==================>...........] - ETA: 56s - loss: 1.8370 - regression_loss: 1.5653 - classification_loss: 0.2717 334/500 [===================>..........] - ETA: 56s - loss: 1.8367 - regression_loss: 1.5652 - classification_loss: 0.2714 335/500 [===================>..........] - ETA: 56s - loss: 1.8396 - regression_loss: 1.5659 - classification_loss: 0.2737 336/500 [===================>..........] - ETA: 55s - loss: 1.8397 - regression_loss: 1.5661 - classification_loss: 0.2735 337/500 [===================>..........] - ETA: 55s - loss: 1.8386 - regression_loss: 1.5653 - classification_loss: 0.2733 338/500 [===================>..........] - ETA: 55s - loss: 1.8371 - regression_loss: 1.5641 - classification_loss: 0.2730 339/500 [===================>..........] - ETA: 54s - loss: 1.8359 - regression_loss: 1.5630 - classification_loss: 0.2729 340/500 [===================>..........] - ETA: 54s - loss: 1.8393 - regression_loss: 1.5659 - classification_loss: 0.2733 341/500 [===================>..........] - ETA: 54s - loss: 1.8380 - regression_loss: 1.5649 - classification_loss: 0.2731 342/500 [===================>..........] - ETA: 53s - loss: 1.8361 - regression_loss: 1.5633 - classification_loss: 0.2728 343/500 [===================>..........] - ETA: 53s - loss: 1.8351 - regression_loss: 1.5624 - classification_loss: 0.2727 344/500 [===================>..........] - ETA: 53s - loss: 1.8349 - regression_loss: 1.5623 - classification_loss: 0.2726 345/500 [===================>..........] - ETA: 52s - loss: 1.8361 - regression_loss: 1.5631 - classification_loss: 0.2730 346/500 [===================>..........] - ETA: 52s - loss: 1.8377 - regression_loss: 1.5642 - classification_loss: 0.2735 347/500 [===================>..........] - ETA: 51s - loss: 1.8372 - regression_loss: 1.5638 - classification_loss: 0.2734 348/500 [===================>..........] 
- ETA: 51s - loss: 1.8353 - regression_loss: 1.5617 - classification_loss: 0.2736 349/500 [===================>..........] - ETA: 51s - loss: 1.8363 - regression_loss: 1.5624 - classification_loss: 0.2739 350/500 [====================>.........] - ETA: 50s - loss: 1.8367 - regression_loss: 1.5630 - classification_loss: 0.2737 351/500 [====================>.........] - ETA: 50s - loss: 1.8367 - regression_loss: 1.5631 - classification_loss: 0.2737 352/500 [====================>.........] - ETA: 50s - loss: 1.8373 - regression_loss: 1.5637 - classification_loss: 0.2736 353/500 [====================>.........] - ETA: 49s - loss: 1.8395 - regression_loss: 1.5659 - classification_loss: 0.2736 354/500 [====================>.........] - ETA: 49s - loss: 1.8398 - regression_loss: 1.5663 - classification_loss: 0.2735 355/500 [====================>.........] - ETA: 49s - loss: 1.8381 - regression_loss: 1.5649 - classification_loss: 0.2732 356/500 [====================>.........] - ETA: 48s - loss: 1.8387 - regression_loss: 1.5651 - classification_loss: 0.2735 357/500 [====================>.........] - ETA: 48s - loss: 1.8385 - regression_loss: 1.5649 - classification_loss: 0.2736 358/500 [====================>.........] - ETA: 48s - loss: 1.8411 - regression_loss: 1.5671 - classification_loss: 0.2741 359/500 [====================>.........] - ETA: 47s - loss: 1.8417 - regression_loss: 1.5676 - classification_loss: 0.2741 360/500 [====================>.........] - ETA: 47s - loss: 1.8409 - regression_loss: 1.5671 - classification_loss: 0.2737 361/500 [====================>.........] - ETA: 47s - loss: 1.8399 - regression_loss: 1.5666 - classification_loss: 0.2733 362/500 [====================>.........] - ETA: 46s - loss: 1.8411 - regression_loss: 1.5675 - classification_loss: 0.2736 363/500 [====================>.........] - ETA: 46s - loss: 1.8413 - regression_loss: 1.5677 - classification_loss: 0.2736 364/500 [====================>.........] 
- ETA: 46s - loss: 1.8427 - regression_loss: 1.5683 - classification_loss: 0.2744 365/500 [====================>.........] - ETA: 45s - loss: 1.8395 - regression_loss: 1.5655 - classification_loss: 0.2740 366/500 [====================>.........] - ETA: 45s - loss: 1.8386 - regression_loss: 1.5647 - classification_loss: 0.2739 367/500 [=====================>........] - ETA: 45s - loss: 1.8392 - regression_loss: 1.5652 - classification_loss: 0.2741 368/500 [=====================>........] - ETA: 44s - loss: 1.8401 - regression_loss: 1.5660 - classification_loss: 0.2741 369/500 [=====================>........] - ETA: 44s - loss: 1.8393 - regression_loss: 1.5654 - classification_loss: 0.2739 370/500 [=====================>........] - ETA: 44s - loss: 1.8392 - regression_loss: 1.5654 - classification_loss: 0.2738 371/500 [=====================>........] - ETA: 43s - loss: 1.8378 - regression_loss: 1.5643 - classification_loss: 0.2735 372/500 [=====================>........] - ETA: 43s - loss: 1.8365 - regression_loss: 1.5632 - classification_loss: 0.2733 373/500 [=====================>........] - ETA: 43s - loss: 1.8375 - regression_loss: 1.5640 - classification_loss: 0.2736 374/500 [=====================>........] - ETA: 42s - loss: 1.8366 - regression_loss: 1.5634 - classification_loss: 0.2733 375/500 [=====================>........] - ETA: 42s - loss: 1.8358 - regression_loss: 1.5628 - classification_loss: 0.2730 376/500 [=====================>........] - ETA: 42s - loss: 1.8364 - regression_loss: 1.5632 - classification_loss: 0.2732 377/500 [=====================>........] - ETA: 41s - loss: 1.8360 - regression_loss: 1.5631 - classification_loss: 0.2729 378/500 [=====================>........] - ETA: 41s - loss: 1.8354 - regression_loss: 1.5627 - classification_loss: 0.2727 379/500 [=====================>........] - ETA: 41s - loss: 1.8343 - regression_loss: 1.5618 - classification_loss: 0.2725 380/500 [=====================>........] 
[epoch 5, steps 381–499: per-step progress updates condensed; loss fluctuated between ~1.83 and ~1.85, regression_loss ~1.56–1.57, classification_loss ~0.272–0.275]
500/500 [==============================] - 170s 340ms/step - loss: 1.8326 - regression_loss: 1.5603 - classification_loss: 0.2723
326 instances of class plum with average precision: 0.7810
mAP: 0.7810
Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5
Epoch 6/150
[epoch 6, steps 1–214: per-step progress updates condensed; loss settled around ~1.75–1.80, regression_loss ~1.50–1.53, classification_loss ~0.25–0.26; last complete reading below]
214/500 [===========>..................] - ETA: 1:37 - loss: 1.7766 - regression_loss: 1.5159 - classification_loss: 0.2608 215/500 [===========>..................]
- ETA: 1:37 - loss: 1.7721 - regression_loss: 1.5123 - classification_loss: 0.2599 216/500 [===========>..................] - ETA: 1:36 - loss: 1.7719 - regression_loss: 1.5124 - classification_loss: 0.2595 217/500 [============>.................] - ETA: 1:36 - loss: 1.7714 - regression_loss: 1.5121 - classification_loss: 0.2593 218/500 [============>.................] - ETA: 1:36 - loss: 1.7716 - regression_loss: 1.5127 - classification_loss: 0.2589 219/500 [============>.................] - ETA: 1:35 - loss: 1.7675 - regression_loss: 1.5095 - classification_loss: 0.2580 220/500 [============>.................] - ETA: 1:35 - loss: 1.7664 - regression_loss: 1.5084 - classification_loss: 0.2579 221/500 [============>.................] - ETA: 1:34 - loss: 1.7674 - regression_loss: 1.5089 - classification_loss: 0.2585 222/500 [============>.................] - ETA: 1:34 - loss: 1.7644 - regression_loss: 1.5066 - classification_loss: 0.2578 223/500 [============>.................] - ETA: 1:34 - loss: 1.7658 - regression_loss: 1.5074 - classification_loss: 0.2584 224/500 [============>.................] - ETA: 1:33 - loss: 1.7650 - regression_loss: 1.5068 - classification_loss: 0.2582 225/500 [============>.................] - ETA: 1:33 - loss: 1.7660 - regression_loss: 1.5074 - classification_loss: 0.2586 226/500 [============>.................] - ETA: 1:33 - loss: 1.7644 - regression_loss: 1.5061 - classification_loss: 0.2583 227/500 [============>.................] - ETA: 1:32 - loss: 1.7649 - regression_loss: 1.5064 - classification_loss: 0.2585 228/500 [============>.................] - ETA: 1:32 - loss: 1.7640 - regression_loss: 1.5058 - classification_loss: 0.2582 229/500 [============>.................] - ETA: 1:32 - loss: 1.7640 - regression_loss: 1.5060 - classification_loss: 0.2580 230/500 [============>.................] - ETA: 1:31 - loss: 1.7641 - regression_loss: 1.5065 - classification_loss: 0.2576 231/500 [============>.................] 
- ETA: 1:31 - loss: 1.7645 - regression_loss: 1.5070 - classification_loss: 0.2576 232/500 [============>.................] - ETA: 1:31 - loss: 1.7650 - regression_loss: 1.5072 - classification_loss: 0.2578 233/500 [============>.................] - ETA: 1:30 - loss: 1.7682 - regression_loss: 1.5097 - classification_loss: 0.2585 234/500 [=============>................] - ETA: 1:30 - loss: 1.7651 - regression_loss: 1.5072 - classification_loss: 0.2579 235/500 [=============>................] - ETA: 1:30 - loss: 1.7656 - regression_loss: 1.5078 - classification_loss: 0.2579 236/500 [=============>................] - ETA: 1:29 - loss: 1.7687 - regression_loss: 1.5101 - classification_loss: 0.2586 237/500 [=============>................] - ETA: 1:29 - loss: 1.7656 - regression_loss: 1.5076 - classification_loss: 0.2580 238/500 [=============>................] - ETA: 1:29 - loss: 1.7636 - regression_loss: 1.5061 - classification_loss: 0.2575 239/500 [=============>................] - ETA: 1:28 - loss: 1.7665 - regression_loss: 1.5084 - classification_loss: 0.2581 240/500 [=============>................] - ETA: 1:28 - loss: 1.7694 - regression_loss: 1.5116 - classification_loss: 0.2578 241/500 [=============>................] - ETA: 1:28 - loss: 1.7692 - regression_loss: 1.5116 - classification_loss: 0.2576 242/500 [=============>................] - ETA: 1:27 - loss: 1.7703 - regression_loss: 1.5123 - classification_loss: 0.2580 243/500 [=============>................] - ETA: 1:27 - loss: 1.7679 - regression_loss: 1.5105 - classification_loss: 0.2574 244/500 [=============>................] - ETA: 1:27 - loss: 1.7669 - regression_loss: 1.5097 - classification_loss: 0.2572 245/500 [=============>................] - ETA: 1:26 - loss: 1.7702 - regression_loss: 1.5124 - classification_loss: 0.2577 246/500 [=============>................] - ETA: 1:26 - loss: 1.7693 - regression_loss: 1.5120 - classification_loss: 0.2573 247/500 [=============>................] 
- ETA: 1:25 - loss: 1.7693 - regression_loss: 1.5120 - classification_loss: 0.2573 248/500 [=============>................] - ETA: 1:25 - loss: 1.7700 - regression_loss: 1.5125 - classification_loss: 0.2574 249/500 [=============>................] - ETA: 1:25 - loss: 1.7689 - regression_loss: 1.5119 - classification_loss: 0.2571 250/500 [==============>...............] - ETA: 1:24 - loss: 1.7680 - regression_loss: 1.5111 - classification_loss: 0.2569 251/500 [==============>...............] - ETA: 1:24 - loss: 1.7679 - regression_loss: 1.5110 - classification_loss: 0.2569 252/500 [==============>...............] - ETA: 1:24 - loss: 1.7694 - regression_loss: 1.5123 - classification_loss: 0.2570 253/500 [==============>...............] - ETA: 1:23 - loss: 1.7687 - regression_loss: 1.5118 - classification_loss: 0.2569 254/500 [==============>...............] - ETA: 1:23 - loss: 1.7683 - regression_loss: 1.5115 - classification_loss: 0.2568 255/500 [==============>...............] - ETA: 1:23 - loss: 1.7696 - regression_loss: 1.5127 - classification_loss: 0.2568 256/500 [==============>...............] - ETA: 1:22 - loss: 1.7710 - regression_loss: 1.5140 - classification_loss: 0.2570 257/500 [==============>...............] - ETA: 1:22 - loss: 1.7731 - regression_loss: 1.5162 - classification_loss: 0.2569 258/500 [==============>...............] - ETA: 1:22 - loss: 1.7789 - regression_loss: 1.5206 - classification_loss: 0.2583 259/500 [==============>...............] - ETA: 1:21 - loss: 1.7785 - regression_loss: 1.5202 - classification_loss: 0.2583 260/500 [==============>...............] - ETA: 1:21 - loss: 1.7783 - regression_loss: 1.5201 - classification_loss: 0.2582 261/500 [==============>...............] - ETA: 1:21 - loss: 1.7774 - regression_loss: 1.5193 - classification_loss: 0.2581 262/500 [==============>...............] - ETA: 1:20 - loss: 1.7788 - regression_loss: 1.5205 - classification_loss: 0.2583 263/500 [==============>...............] 
- ETA: 1:20 - loss: 1.7793 - regression_loss: 1.5215 - classification_loss: 0.2577 264/500 [==============>...............] - ETA: 1:20 - loss: 1.7763 - regression_loss: 1.5192 - classification_loss: 0.2571 265/500 [==============>...............] - ETA: 1:19 - loss: 1.7759 - regression_loss: 1.5192 - classification_loss: 0.2567 266/500 [==============>...............] - ETA: 1:19 - loss: 1.7771 - regression_loss: 1.5203 - classification_loss: 0.2567 267/500 [===============>..............] - ETA: 1:19 - loss: 1.7759 - regression_loss: 1.5194 - classification_loss: 0.2565 268/500 [===============>..............] - ETA: 1:18 - loss: 1.7792 - regression_loss: 1.5214 - classification_loss: 0.2578 269/500 [===============>..............] - ETA: 1:18 - loss: 1.7760 - regression_loss: 1.5188 - classification_loss: 0.2573 270/500 [===============>..............] - ETA: 1:18 - loss: 1.7757 - regression_loss: 1.5183 - classification_loss: 0.2574 271/500 [===============>..............] - ETA: 1:17 - loss: 1.7735 - regression_loss: 1.5166 - classification_loss: 0.2569 272/500 [===============>..............] - ETA: 1:17 - loss: 1.7699 - regression_loss: 1.5136 - classification_loss: 0.2563 273/500 [===============>..............] - ETA: 1:17 - loss: 1.7658 - regression_loss: 1.5095 - classification_loss: 0.2563 274/500 [===============>..............] - ETA: 1:16 - loss: 1.7647 - regression_loss: 1.5089 - classification_loss: 0.2558 275/500 [===============>..............] - ETA: 1:16 - loss: 1.7661 - regression_loss: 1.5101 - classification_loss: 0.2559 276/500 [===============>..............] - ETA: 1:16 - loss: 1.7656 - regression_loss: 1.5099 - classification_loss: 0.2557 277/500 [===============>..............] - ETA: 1:15 - loss: 1.7656 - regression_loss: 1.5099 - classification_loss: 0.2558 278/500 [===============>..............] - ETA: 1:15 - loss: 1.7661 - regression_loss: 1.5105 - classification_loss: 0.2556 279/500 [===============>..............] 
- ETA: 1:15 - loss: 1.7763 - regression_loss: 1.5188 - classification_loss: 0.2575 280/500 [===============>..............] - ETA: 1:14 - loss: 1.7729 - regression_loss: 1.5156 - classification_loss: 0.2572 281/500 [===============>..............] - ETA: 1:14 - loss: 1.7727 - regression_loss: 1.5158 - classification_loss: 0.2569 282/500 [===============>..............] - ETA: 1:14 - loss: 1.7788 - regression_loss: 1.5209 - classification_loss: 0.2579 283/500 [===============>..............] - ETA: 1:13 - loss: 1.7801 - regression_loss: 1.5217 - classification_loss: 0.2584 284/500 [================>.............] - ETA: 1:13 - loss: 1.7770 - regression_loss: 1.5189 - classification_loss: 0.2581 285/500 [================>.............] - ETA: 1:13 - loss: 1.7771 - regression_loss: 1.5190 - classification_loss: 0.2582 286/500 [================>.............] - ETA: 1:12 - loss: 1.7762 - regression_loss: 1.5184 - classification_loss: 0.2577 287/500 [================>.............] - ETA: 1:12 - loss: 1.7753 - regression_loss: 1.5177 - classification_loss: 0.2576 288/500 [================>.............] - ETA: 1:12 - loss: 1.7764 - regression_loss: 1.5185 - classification_loss: 0.2579 289/500 [================>.............] - ETA: 1:11 - loss: 1.7757 - regression_loss: 1.5180 - classification_loss: 0.2577 290/500 [================>.............] - ETA: 1:11 - loss: 1.7740 - regression_loss: 1.5166 - classification_loss: 0.2574 291/500 [================>.............] - ETA: 1:11 - loss: 1.7750 - regression_loss: 1.5176 - classification_loss: 0.2574 292/500 [================>.............] - ETA: 1:10 - loss: 1.7772 - regression_loss: 1.5192 - classification_loss: 0.2580 293/500 [================>.............] - ETA: 1:10 - loss: 1.7791 - regression_loss: 1.5209 - classification_loss: 0.2583 294/500 [================>.............] - ETA: 1:09 - loss: 1.7804 - regression_loss: 1.5221 - classification_loss: 0.2583 295/500 [================>.............] 
- ETA: 1:09 - loss: 1.7783 - regression_loss: 1.5205 - classification_loss: 0.2578 296/500 [================>.............] - ETA: 1:09 - loss: 1.7773 - regression_loss: 1.5197 - classification_loss: 0.2576 297/500 [================>.............] - ETA: 1:08 - loss: 1.7770 - regression_loss: 1.5196 - classification_loss: 0.2574 298/500 [================>.............] - ETA: 1:08 - loss: 1.7755 - regression_loss: 1.5183 - classification_loss: 0.2572 299/500 [================>.............] - ETA: 1:08 - loss: 1.7747 - regression_loss: 1.5177 - classification_loss: 0.2570 300/500 [=================>............] - ETA: 1:07 - loss: 1.7749 - regression_loss: 1.5181 - classification_loss: 0.2568 301/500 [=================>............] - ETA: 1:07 - loss: 1.7722 - regression_loss: 1.5161 - classification_loss: 0.2562 302/500 [=================>............] - ETA: 1:07 - loss: 1.7709 - regression_loss: 1.5150 - classification_loss: 0.2559 303/500 [=================>............] - ETA: 1:06 - loss: 1.7715 - regression_loss: 1.5156 - classification_loss: 0.2559 304/500 [=================>............] - ETA: 1:06 - loss: 1.7696 - regression_loss: 1.5141 - classification_loss: 0.2554 305/500 [=================>............] - ETA: 1:06 - loss: 1.7650 - regression_loss: 1.5101 - classification_loss: 0.2549 306/500 [=================>............] - ETA: 1:05 - loss: 1.7634 - regression_loss: 1.5088 - classification_loss: 0.2546 307/500 [=================>............] - ETA: 1:05 - loss: 1.7625 - regression_loss: 1.5072 - classification_loss: 0.2552 308/500 [=================>............] - ETA: 1:05 - loss: 1.7612 - regression_loss: 1.5063 - classification_loss: 0.2548 309/500 [=================>............] - ETA: 1:04 - loss: 1.7594 - regression_loss: 1.5050 - classification_loss: 0.2544 310/500 [=================>............] - ETA: 1:04 - loss: 1.7569 - regression_loss: 1.5028 - classification_loss: 0.2541 311/500 [=================>............] 
- ETA: 1:04 - loss: 1.7571 - regression_loss: 1.5028 - classification_loss: 0.2543 312/500 [=================>............] - ETA: 1:03 - loss: 1.7594 - regression_loss: 1.5044 - classification_loss: 0.2549 313/500 [=================>............] - ETA: 1:03 - loss: 1.7576 - regression_loss: 1.5030 - classification_loss: 0.2545 314/500 [=================>............] - ETA: 1:03 - loss: 1.7548 - regression_loss: 1.5008 - classification_loss: 0.2540 315/500 [=================>............] - ETA: 1:02 - loss: 1.7538 - regression_loss: 1.4999 - classification_loss: 0.2539 316/500 [=================>............] - ETA: 1:02 - loss: 1.7546 - regression_loss: 1.5008 - classification_loss: 0.2538 317/500 [==================>...........] - ETA: 1:02 - loss: 1.7548 - regression_loss: 1.5007 - classification_loss: 0.2540 318/500 [==================>...........] - ETA: 1:01 - loss: 1.7546 - regression_loss: 1.5008 - classification_loss: 0.2538 319/500 [==================>...........] - ETA: 1:01 - loss: 1.7603 - regression_loss: 1.5056 - classification_loss: 0.2547 320/500 [==================>...........] - ETA: 1:01 - loss: 1.7621 - regression_loss: 1.5070 - classification_loss: 0.2551 321/500 [==================>...........] - ETA: 1:00 - loss: 1.7633 - regression_loss: 1.5081 - classification_loss: 0.2552 322/500 [==================>...........] - ETA: 1:00 - loss: 1.7642 - regression_loss: 1.5089 - classification_loss: 0.2553 323/500 [==================>...........] - ETA: 1:00 - loss: 1.7642 - regression_loss: 1.5090 - classification_loss: 0.2552 324/500 [==================>...........] - ETA: 59s - loss: 1.7627 - regression_loss: 1.5077 - classification_loss: 0.2550  325/500 [==================>...........] - ETA: 59s - loss: 1.7624 - regression_loss: 1.5076 - classification_loss: 0.2548 326/500 [==================>...........] - ETA: 58s - loss: 1.7640 - regression_loss: 1.5092 - classification_loss: 0.2549 327/500 [==================>...........] 
- ETA: 58s - loss: 1.7655 - regression_loss: 1.5103 - classification_loss: 0.2552 328/500 [==================>...........] - ETA: 58s - loss: 1.7660 - regression_loss: 1.5108 - classification_loss: 0.2551 329/500 [==================>...........] - ETA: 57s - loss: 1.7673 - regression_loss: 1.5119 - classification_loss: 0.2554 330/500 [==================>...........] - ETA: 57s - loss: 1.7671 - regression_loss: 1.5118 - classification_loss: 0.2554 331/500 [==================>...........] - ETA: 57s - loss: 1.7644 - regression_loss: 1.5094 - classification_loss: 0.2550 332/500 [==================>...........] - ETA: 56s - loss: 1.7633 - regression_loss: 1.5085 - classification_loss: 0.2548 333/500 [==================>...........] - ETA: 56s - loss: 1.7625 - regression_loss: 1.5078 - classification_loss: 0.2547 334/500 [===================>..........] - ETA: 56s - loss: 1.7626 - regression_loss: 1.5081 - classification_loss: 0.2545 335/500 [===================>..........] - ETA: 55s - loss: 1.7645 - regression_loss: 1.5093 - classification_loss: 0.2552 336/500 [===================>..........] - ETA: 55s - loss: 1.7642 - regression_loss: 1.5092 - classification_loss: 0.2550 337/500 [===================>..........] - ETA: 55s - loss: 1.7634 - regression_loss: 1.5085 - classification_loss: 0.2549 338/500 [===================>..........] - ETA: 54s - loss: 1.7663 - regression_loss: 1.5110 - classification_loss: 0.2554 339/500 [===================>..........] - ETA: 54s - loss: 1.7668 - regression_loss: 1.5116 - classification_loss: 0.2552 340/500 [===================>..........] - ETA: 54s - loss: 1.7655 - regression_loss: 1.5105 - classification_loss: 0.2550 341/500 [===================>..........] - ETA: 53s - loss: 1.7663 - regression_loss: 1.5117 - classification_loss: 0.2546 342/500 [===================>..........] - ETA: 53s - loss: 1.7668 - regression_loss: 1.5122 - classification_loss: 0.2545 343/500 [===================>..........] 
- ETA: 53s - loss: 1.7666 - regression_loss: 1.5121 - classification_loss: 0.2545 344/500 [===================>..........] - ETA: 52s - loss: 1.7656 - regression_loss: 1.5113 - classification_loss: 0.2543 345/500 [===================>..........] - ETA: 52s - loss: 1.7651 - regression_loss: 1.5110 - classification_loss: 0.2542 346/500 [===================>..........] - ETA: 52s - loss: 1.7639 - regression_loss: 1.5101 - classification_loss: 0.2538 347/500 [===================>..........] - ETA: 51s - loss: 1.7629 - regression_loss: 1.5095 - classification_loss: 0.2535 348/500 [===================>..........] - ETA: 51s - loss: 1.7685 - regression_loss: 1.5141 - classification_loss: 0.2544 349/500 [===================>..........] - ETA: 51s - loss: 1.7688 - regression_loss: 1.5145 - classification_loss: 0.2543 350/500 [====================>.........] - ETA: 50s - loss: 1.7709 - regression_loss: 1.5158 - classification_loss: 0.2551 351/500 [====================>.........] - ETA: 50s - loss: 1.7701 - regression_loss: 1.5148 - classification_loss: 0.2552 352/500 [====================>.........] - ETA: 50s - loss: 1.7694 - regression_loss: 1.5144 - classification_loss: 0.2550 353/500 [====================>.........] - ETA: 49s - loss: 1.7677 - regression_loss: 1.5129 - classification_loss: 0.2548 354/500 [====================>.........] - ETA: 49s - loss: 1.7662 - regression_loss: 1.5116 - classification_loss: 0.2546 355/500 [====================>.........] - ETA: 49s - loss: 1.7632 - regression_loss: 1.5090 - classification_loss: 0.2542 356/500 [====================>.........] - ETA: 48s - loss: 1.7642 - regression_loss: 1.5097 - classification_loss: 0.2545 357/500 [====================>.........] - ETA: 48s - loss: 1.7668 - regression_loss: 1.5114 - classification_loss: 0.2554 358/500 [====================>.........] - ETA: 48s - loss: 1.7662 - regression_loss: 1.5109 - classification_loss: 0.2553 359/500 [====================>.........] 
- ETA: 47s - loss: 1.7680 - regression_loss: 1.5121 - classification_loss: 0.2560 360/500 [====================>.........] - ETA: 47s - loss: 1.7699 - regression_loss: 1.5135 - classification_loss: 0.2565 361/500 [====================>.........] - ETA: 47s - loss: 1.7696 - regression_loss: 1.5133 - classification_loss: 0.2563 362/500 [====================>.........] - ETA: 46s - loss: 1.7670 - regression_loss: 1.5112 - classification_loss: 0.2558 363/500 [====================>.........] - ETA: 46s - loss: 1.7668 - regression_loss: 1.5109 - classification_loss: 0.2558 364/500 [====================>.........] - ETA: 46s - loss: 1.7653 - regression_loss: 1.5096 - classification_loss: 0.2556 365/500 [====================>.........] - ETA: 45s - loss: 1.7652 - regression_loss: 1.5096 - classification_loss: 0.2557 366/500 [====================>.........] - ETA: 45s - loss: 1.7638 - regression_loss: 1.5084 - classification_loss: 0.2554 367/500 [=====================>........] - ETA: 45s - loss: 1.7629 - regression_loss: 1.5078 - classification_loss: 0.2551 368/500 [=====================>........] - ETA: 44s - loss: 1.7643 - regression_loss: 1.5089 - classification_loss: 0.2554 369/500 [=====================>........] - ETA: 44s - loss: 1.7637 - regression_loss: 1.5086 - classification_loss: 0.2551 370/500 [=====================>........] - ETA: 44s - loss: 1.7620 - regression_loss: 1.5072 - classification_loss: 0.2548 371/500 [=====================>........] - ETA: 43s - loss: 1.7621 - regression_loss: 1.5071 - classification_loss: 0.2550 372/500 [=====================>........] - ETA: 43s - loss: 1.7619 - regression_loss: 1.5069 - classification_loss: 0.2549 373/500 [=====================>........] - ETA: 43s - loss: 1.7599 - regression_loss: 1.5050 - classification_loss: 0.2549 374/500 [=====================>........] - ETA: 42s - loss: 1.7591 - regression_loss: 1.5044 - classification_loss: 0.2547 375/500 [=====================>........] 
- ETA: 42s - loss: 1.7587 - regression_loss: 1.5040 - classification_loss: 0.2547 376/500 [=====================>........] - ETA: 42s - loss: 1.7555 - regression_loss: 1.5011 - classification_loss: 0.2544 377/500 [=====================>........] - ETA: 41s - loss: 1.7544 - regression_loss: 1.5001 - classification_loss: 0.2543 378/500 [=====================>........] - ETA: 41s - loss: 1.7539 - regression_loss: 1.4998 - classification_loss: 0.2541 379/500 [=====================>........] - ETA: 41s - loss: 1.7532 - regression_loss: 1.4993 - classification_loss: 0.2539 380/500 [=====================>........] - ETA: 40s - loss: 1.7532 - regression_loss: 1.4993 - classification_loss: 0.2539 381/500 [=====================>........] - ETA: 40s - loss: 1.7533 - regression_loss: 1.4994 - classification_loss: 0.2540 382/500 [=====================>........] - ETA: 40s - loss: 1.7514 - regression_loss: 1.4977 - classification_loss: 0.2537 383/500 [=====================>........] - ETA: 39s - loss: 1.7500 - regression_loss: 1.4965 - classification_loss: 0.2535 384/500 [======================>.......] - ETA: 39s - loss: 1.7496 - regression_loss: 1.4964 - classification_loss: 0.2533 385/500 [======================>.......] - ETA: 39s - loss: 1.7498 - regression_loss: 1.4965 - classification_loss: 0.2533 386/500 [======================>.......] - ETA: 38s - loss: 1.7534 - regression_loss: 1.4993 - classification_loss: 0.2541 387/500 [======================>.......] - ETA: 38s - loss: 1.7548 - regression_loss: 1.5006 - classification_loss: 0.2542 388/500 [======================>.......] - ETA: 38s - loss: 1.7558 - regression_loss: 1.5016 - classification_loss: 0.2542 389/500 [======================>.......] - ETA: 37s - loss: 1.7553 - regression_loss: 1.5011 - classification_loss: 0.2542 390/500 [======================>.......] - ETA: 37s - loss: 1.7564 - regression_loss: 1.5017 - classification_loss: 0.2547 391/500 [======================>.......] 
- ETA: 36s - loss: 1.7558 - regression_loss: 1.5013 - classification_loss: 0.2545 392/500 [======================>.......] - ETA: 36s - loss: 1.7539 - regression_loss: 1.4992 - classification_loss: 0.2547 393/500 [======================>.......] - ETA: 36s - loss: 1.7526 - regression_loss: 1.4982 - classification_loss: 0.2544 394/500 [======================>.......] - ETA: 35s - loss: 1.7524 - regression_loss: 1.4981 - classification_loss: 0.2543 395/500 [======================>.......] - ETA: 35s - loss: 1.7518 - regression_loss: 1.4977 - classification_loss: 0.2541 396/500 [======================>.......] - ETA: 35s - loss: 1.7508 - regression_loss: 1.4969 - classification_loss: 0.2539 397/500 [======================>.......] - ETA: 34s - loss: 1.7505 - regression_loss: 1.4967 - classification_loss: 0.2538 398/500 [======================>.......] - ETA: 34s - loss: 1.7490 - regression_loss: 1.4956 - classification_loss: 0.2534 399/500 [======================>.......] - ETA: 34s - loss: 1.7488 - regression_loss: 1.4956 - classification_loss: 0.2533 400/500 [=======================>......] - ETA: 33s - loss: 1.7517 - regression_loss: 1.4983 - classification_loss: 0.2535 401/500 [=======================>......] - ETA: 33s - loss: 1.7507 - regression_loss: 1.4973 - classification_loss: 0.2534 402/500 [=======================>......] - ETA: 33s - loss: 1.7497 - regression_loss: 1.4966 - classification_loss: 0.2531 403/500 [=======================>......] - ETA: 32s - loss: 1.7499 - regression_loss: 1.4970 - classification_loss: 0.2529 404/500 [=======================>......] - ETA: 32s - loss: 1.7493 - regression_loss: 1.4965 - classification_loss: 0.2528 405/500 [=======================>......] - ETA: 32s - loss: 1.7523 - regression_loss: 1.4988 - classification_loss: 0.2535 406/500 [=======================>......] - ETA: 31s - loss: 1.7524 - regression_loss: 1.4987 - classification_loss: 0.2537 407/500 [=======================>......] 
- ETA: 31s - loss: 1.7515 - regression_loss: 1.4978 - classification_loss: 0.2538 408/500 [=======================>......] - ETA: 31s - loss: 1.7527 - regression_loss: 1.4987 - classification_loss: 0.2540 409/500 [=======================>......] - ETA: 30s - loss: 1.7527 - regression_loss: 1.4989 - classification_loss: 0.2538 410/500 [=======================>......] - ETA: 30s - loss: 1.7515 - regression_loss: 1.4981 - classification_loss: 0.2534 411/500 [=======================>......] - ETA: 30s - loss: 1.7513 - regression_loss: 1.4981 - classification_loss: 0.2532 412/500 [=======================>......] - ETA: 29s - loss: 1.7500 - regression_loss: 1.4971 - classification_loss: 0.2529 413/500 [=======================>......] - ETA: 29s - loss: 1.7516 - regression_loss: 1.4988 - classification_loss: 0.2529 414/500 [=======================>......] - ETA: 29s - loss: 1.7562 - regression_loss: 1.5027 - classification_loss: 0.2535 415/500 [=======================>......] - ETA: 28s - loss: 1.7556 - regression_loss: 1.5023 - classification_loss: 0.2533 416/500 [=======================>......] - ETA: 28s - loss: 1.7542 - regression_loss: 1.5013 - classification_loss: 0.2529 417/500 [========================>.....] - ETA: 28s - loss: 1.7540 - regression_loss: 1.5012 - classification_loss: 0.2528 418/500 [========================>.....] - ETA: 27s - loss: 1.7552 - regression_loss: 1.5018 - classification_loss: 0.2534 419/500 [========================>.....] - ETA: 27s - loss: 1.7536 - regression_loss: 1.5005 - classification_loss: 0.2531 420/500 [========================>.....] - ETA: 27s - loss: 1.7571 - regression_loss: 1.5020 - classification_loss: 0.2550 421/500 [========================>.....] - ETA: 26s - loss: 1.7578 - regression_loss: 1.5027 - classification_loss: 0.2551 422/500 [========================>.....] - ETA: 26s - loss: 1.7581 - regression_loss: 1.5032 - classification_loss: 0.2549 423/500 [========================>.....] 
[per-step progress output for steps 424–499 of epoch 6 elided; loss held near 1.75 throughout]
500/500 [==============================] - 170s 339ms/step - loss: 1.7505 - regression_loss: 1.4973 - classification_loss: 0.2531
326 instances of class plum with average precision: 0.7957
mAP: 0.7957
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/150
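A note on reading the summary line above: the `loss` column Keras reports is simply the sum of the two sub-losses tracked by this RetinaNet model (box regression and focal classification), so the totals can be sanity-checked by hand. A minimal sketch, using the final epoch-6 values from the log (the last digit differs only by display rounding):

```python
# Epoch-6 final values as printed in the log above.
regression_loss = 1.4973      # smooth-L1 box regression term
classification_loss = 0.2531  # focal classification term
logged_total = 1.7505         # the `loss` column

# The reported total is the sum of the two terms; any gap beyond
# ~5e-4 would indicate the columns were misread.
total = regression_loss + classification_loss
print(round(total, 4))  # 1.7504 -- matches the logged 1.7505 up to rounding
assert abs(total - logged_total) < 1e-3
```

The same check applies to every per-step line in the progress output, which is a quick way to spot a truncated or garbled log line.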
[per-step progress output for steps 1–257 of epoch 7 elided; this chunk ends mid-epoch with loss ≈ 1.70 (regression ≈ 1.45, classification ≈ 0.24)]
- ETA: 1:22 - loss: 1.6966 - regression_loss: 1.4528 - classification_loss: 0.2438 259/500 [==============>...............] - ETA: 1:22 - loss: 1.6993 - regression_loss: 1.4548 - classification_loss: 0.2445 260/500 [==============>...............] - ETA: 1:21 - loss: 1.6952 - regression_loss: 1.4513 - classification_loss: 0.2439 261/500 [==============>...............] - ETA: 1:21 - loss: 1.6937 - regression_loss: 1.4499 - classification_loss: 0.2438 262/500 [==============>...............] - ETA: 1:21 - loss: 1.6917 - regression_loss: 1.4481 - classification_loss: 0.2436 263/500 [==============>...............] - ETA: 1:20 - loss: 1.6908 - regression_loss: 1.4472 - classification_loss: 0.2436 264/500 [==============>...............] - ETA: 1:20 - loss: 1.6877 - regression_loss: 1.4448 - classification_loss: 0.2430 265/500 [==============>...............] - ETA: 1:20 - loss: 1.6880 - regression_loss: 1.4448 - classification_loss: 0.2432 266/500 [==============>...............] - ETA: 1:19 - loss: 1.6944 - regression_loss: 1.4508 - classification_loss: 0.2436 267/500 [===============>..............] - ETA: 1:19 - loss: 1.6944 - regression_loss: 1.4507 - classification_loss: 0.2437 268/500 [===============>..............] - ETA: 1:18 - loss: 1.6922 - regression_loss: 1.4490 - classification_loss: 0.2433 269/500 [===============>..............] - ETA: 1:18 - loss: 1.6912 - regression_loss: 1.4484 - classification_loss: 0.2428 270/500 [===============>..............] - ETA: 1:18 - loss: 1.6901 - regression_loss: 1.4476 - classification_loss: 0.2425 271/500 [===============>..............] - ETA: 1:17 - loss: 1.6906 - regression_loss: 1.4483 - classification_loss: 0.2423 272/500 [===============>..............] - ETA: 1:17 - loss: 1.6903 - regression_loss: 1.4481 - classification_loss: 0.2422 273/500 [===============>..............] - ETA: 1:17 - loss: 1.6893 - regression_loss: 1.4473 - classification_loss: 0.2420 274/500 [===============>..............] 
- ETA: 1:16 - loss: 1.6895 - regression_loss: 1.4475 - classification_loss: 0.2419 275/500 [===============>..............] - ETA: 1:16 - loss: 1.6892 - regression_loss: 1.4472 - classification_loss: 0.2420 276/500 [===============>..............] - ETA: 1:16 - loss: 1.6882 - regression_loss: 1.4453 - classification_loss: 0.2429 277/500 [===============>..............] - ETA: 1:15 - loss: 1.6879 - regression_loss: 1.4451 - classification_loss: 0.2428 278/500 [===============>..............] - ETA: 1:15 - loss: 1.6911 - regression_loss: 1.4477 - classification_loss: 0.2434 279/500 [===============>..............] - ETA: 1:15 - loss: 1.6889 - regression_loss: 1.4459 - classification_loss: 0.2430 280/500 [===============>..............] - ETA: 1:14 - loss: 1.6894 - regression_loss: 1.4461 - classification_loss: 0.2433 281/500 [===============>..............] - ETA: 1:14 - loss: 1.6859 - regression_loss: 1.4432 - classification_loss: 0.2427 282/500 [===============>..............] - ETA: 1:14 - loss: 1.6861 - regression_loss: 1.4434 - classification_loss: 0.2427 283/500 [===============>..............] - ETA: 1:13 - loss: 1.6877 - regression_loss: 1.4448 - classification_loss: 0.2429 284/500 [================>.............] - ETA: 1:13 - loss: 1.6848 - regression_loss: 1.4423 - classification_loss: 0.2424 285/500 [================>.............] - ETA: 1:13 - loss: 1.6884 - regression_loss: 1.4455 - classification_loss: 0.2429 286/500 [================>.............] - ETA: 1:12 - loss: 1.6874 - regression_loss: 1.4445 - classification_loss: 0.2429 287/500 [================>.............] - ETA: 1:12 - loss: 1.6856 - regression_loss: 1.4431 - classification_loss: 0.2425 288/500 [================>.............] - ETA: 1:12 - loss: 1.6853 - regression_loss: 1.4430 - classification_loss: 0.2423 289/500 [================>.............] - ETA: 1:11 - loss: 1.6856 - regression_loss: 1.4434 - classification_loss: 0.2422 290/500 [================>.............] 
- ETA: 1:11 - loss: 1.6906 - regression_loss: 1.4469 - classification_loss: 0.2437 291/500 [================>.............] - ETA: 1:11 - loss: 1.6937 - regression_loss: 1.4492 - classification_loss: 0.2445 292/500 [================>.............] - ETA: 1:10 - loss: 1.6940 - regression_loss: 1.4496 - classification_loss: 0.2443 293/500 [================>.............] - ETA: 1:10 - loss: 1.6933 - regression_loss: 1.4489 - classification_loss: 0.2443 294/500 [================>.............] - ETA: 1:10 - loss: 1.6919 - regression_loss: 1.4479 - classification_loss: 0.2440 295/500 [================>.............] - ETA: 1:09 - loss: 1.6934 - regression_loss: 1.4491 - classification_loss: 0.2443 296/500 [================>.............] - ETA: 1:09 - loss: 1.6954 - regression_loss: 1.4505 - classification_loss: 0.2449 297/500 [================>.............] - ETA: 1:09 - loss: 1.6948 - regression_loss: 1.4503 - classification_loss: 0.2445 298/500 [================>.............] - ETA: 1:08 - loss: 1.6945 - regression_loss: 1.4500 - classification_loss: 0.2445 299/500 [================>.............] - ETA: 1:08 - loss: 1.6939 - regression_loss: 1.4494 - classification_loss: 0.2445 300/500 [=================>............] - ETA: 1:08 - loss: 1.6934 - regression_loss: 1.4490 - classification_loss: 0.2445 301/500 [=================>............] - ETA: 1:07 - loss: 1.6941 - regression_loss: 1.4497 - classification_loss: 0.2444 302/500 [=================>............] - ETA: 1:07 - loss: 1.6939 - regression_loss: 1.4494 - classification_loss: 0.2444 303/500 [=================>............] - ETA: 1:07 - loss: 1.6921 - regression_loss: 1.4480 - classification_loss: 0.2441 304/500 [=================>............] - ETA: 1:06 - loss: 1.6924 - regression_loss: 1.4483 - classification_loss: 0.2441 305/500 [=================>............] - ETA: 1:06 - loss: 1.6917 - regression_loss: 1.4477 - classification_loss: 0.2439 306/500 [=================>............] 
- ETA: 1:06 - loss: 1.6888 - regression_loss: 1.4453 - classification_loss: 0.2435 307/500 [=================>............] - ETA: 1:05 - loss: 1.6880 - regression_loss: 1.4447 - classification_loss: 0.2433 308/500 [=================>............] - ETA: 1:05 - loss: 1.6867 - regression_loss: 1.4437 - classification_loss: 0.2430 309/500 [=================>............] - ETA: 1:05 - loss: 1.6876 - regression_loss: 1.4448 - classification_loss: 0.2429 310/500 [=================>............] - ETA: 1:04 - loss: 1.6877 - regression_loss: 1.4433 - classification_loss: 0.2445 311/500 [=================>............] - ETA: 1:04 - loss: 1.6875 - regression_loss: 1.4430 - classification_loss: 0.2445 312/500 [=================>............] - ETA: 1:03 - loss: 1.6869 - regression_loss: 1.4425 - classification_loss: 0.2444 313/500 [=================>............] - ETA: 1:03 - loss: 1.6856 - regression_loss: 1.4415 - classification_loss: 0.2441 314/500 [=================>............] - ETA: 1:03 - loss: 1.6846 - regression_loss: 1.4406 - classification_loss: 0.2440 315/500 [=================>............] - ETA: 1:02 - loss: 1.6844 - regression_loss: 1.4406 - classification_loss: 0.2438 316/500 [=================>............] - ETA: 1:02 - loss: 1.6858 - regression_loss: 1.4416 - classification_loss: 0.2442 317/500 [==================>...........] - ETA: 1:02 - loss: 1.6864 - regression_loss: 1.4421 - classification_loss: 0.2443 318/500 [==================>...........] - ETA: 1:01 - loss: 1.6845 - regression_loss: 1.4405 - classification_loss: 0.2440 319/500 [==================>...........] - ETA: 1:01 - loss: 1.6850 - regression_loss: 1.4410 - classification_loss: 0.2441 320/500 [==================>...........] - ETA: 1:01 - loss: 1.6869 - regression_loss: 1.4427 - classification_loss: 0.2443 321/500 [==================>...........] - ETA: 1:00 - loss: 1.6858 - regression_loss: 1.4418 - classification_loss: 0.2440 322/500 [==================>...........] 
- ETA: 1:00 - loss: 1.6831 - regression_loss: 1.4396 - classification_loss: 0.2435 323/500 [==================>...........] - ETA: 1:00 - loss: 1.6829 - regression_loss: 1.4394 - classification_loss: 0.2436 324/500 [==================>...........] - ETA: 59s - loss: 1.6821 - regression_loss: 1.4387 - classification_loss: 0.2434  325/500 [==================>...........] - ETA: 59s - loss: 1.6820 - regression_loss: 1.4386 - classification_loss: 0.2434 326/500 [==================>...........] - ETA: 59s - loss: 1.6820 - regression_loss: 1.4384 - classification_loss: 0.2436 327/500 [==================>...........] - ETA: 58s - loss: 1.6805 - regression_loss: 1.4373 - classification_loss: 0.2432 328/500 [==================>...........] - ETA: 58s - loss: 1.6790 - regression_loss: 1.4362 - classification_loss: 0.2429 329/500 [==================>...........] - ETA: 58s - loss: 1.6797 - regression_loss: 1.4361 - classification_loss: 0.2436 330/500 [==================>...........] - ETA: 57s - loss: 1.6788 - regression_loss: 1.4350 - classification_loss: 0.2438 331/500 [==================>...........] - ETA: 57s - loss: 1.6760 - regression_loss: 1.4324 - classification_loss: 0.2436 332/500 [==================>...........] - ETA: 57s - loss: 1.6779 - regression_loss: 1.4334 - classification_loss: 0.2445 333/500 [==================>...........] - ETA: 56s - loss: 1.6798 - regression_loss: 1.4345 - classification_loss: 0.2453 334/500 [===================>..........] - ETA: 56s - loss: 1.6802 - regression_loss: 1.4348 - classification_loss: 0.2453 335/500 [===================>..........] - ETA: 56s - loss: 1.6785 - regression_loss: 1.4336 - classification_loss: 0.2449 336/500 [===================>..........] - ETA: 55s - loss: 1.6769 - regression_loss: 1.4324 - classification_loss: 0.2445 337/500 [===================>..........] - ETA: 55s - loss: 1.6766 - regression_loss: 1.4321 - classification_loss: 0.2445 338/500 [===================>..........] 
- ETA: 55s - loss: 1.6740 - regression_loss: 1.4300 - classification_loss: 0.2440 339/500 [===================>..........] - ETA: 54s - loss: 1.6718 - regression_loss: 1.4282 - classification_loss: 0.2436 340/500 [===================>..........] - ETA: 54s - loss: 1.6729 - regression_loss: 1.4289 - classification_loss: 0.2440 341/500 [===================>..........] - ETA: 54s - loss: 1.6734 - regression_loss: 1.4294 - classification_loss: 0.2440 342/500 [===================>..........] - ETA: 53s - loss: 1.6727 - regression_loss: 1.4290 - classification_loss: 0.2437 343/500 [===================>..........] - ETA: 53s - loss: 1.6717 - regression_loss: 1.4281 - classification_loss: 0.2435 344/500 [===================>..........] - ETA: 53s - loss: 1.6716 - regression_loss: 1.4283 - classification_loss: 0.2434 345/500 [===================>..........] - ETA: 52s - loss: 1.6710 - regression_loss: 1.4278 - classification_loss: 0.2433 346/500 [===================>..........] - ETA: 52s - loss: 1.6703 - regression_loss: 1.4270 - classification_loss: 0.2433 347/500 [===================>..........] - ETA: 52s - loss: 1.6720 - regression_loss: 1.4284 - classification_loss: 0.2435 348/500 [===================>..........] - ETA: 51s - loss: 1.6717 - regression_loss: 1.4283 - classification_loss: 0.2434 349/500 [===================>..........] - ETA: 51s - loss: 1.6718 - regression_loss: 1.4285 - classification_loss: 0.2433 350/500 [====================>.........] - ETA: 50s - loss: 1.6697 - regression_loss: 1.4269 - classification_loss: 0.2428 351/500 [====================>.........] - ETA: 50s - loss: 1.6687 - regression_loss: 1.4262 - classification_loss: 0.2426 352/500 [====================>.........] - ETA: 50s - loss: 1.6677 - regression_loss: 1.4254 - classification_loss: 0.2424 353/500 [====================>.........] - ETA: 49s - loss: 1.6692 - regression_loss: 1.4270 - classification_loss: 0.2421 354/500 [====================>.........] 
- ETA: 49s - loss: 1.6701 - regression_loss: 1.4279 - classification_loss: 0.2421 355/500 [====================>.........] - ETA: 49s - loss: 1.6715 - regression_loss: 1.4291 - classification_loss: 0.2424 356/500 [====================>.........] - ETA: 48s - loss: 1.6716 - regression_loss: 1.4294 - classification_loss: 0.2422 357/500 [====================>.........] - ETA: 48s - loss: 1.6704 - regression_loss: 1.4284 - classification_loss: 0.2420 358/500 [====================>.........] - ETA: 48s - loss: 1.6691 - regression_loss: 1.4274 - classification_loss: 0.2418 359/500 [====================>.........] - ETA: 47s - loss: 1.6687 - regression_loss: 1.4271 - classification_loss: 0.2416 360/500 [====================>.........] - ETA: 47s - loss: 1.6692 - regression_loss: 1.4274 - classification_loss: 0.2418 361/500 [====================>.........] - ETA: 47s - loss: 1.6684 - regression_loss: 1.4269 - classification_loss: 0.2415 362/500 [====================>.........] - ETA: 46s - loss: 1.6653 - regression_loss: 1.4241 - classification_loss: 0.2412 363/500 [====================>.........] - ETA: 46s - loss: 1.6655 - regression_loss: 1.4244 - classification_loss: 0.2411 364/500 [====================>.........] - ETA: 46s - loss: 1.6672 - regression_loss: 1.4255 - classification_loss: 0.2416 365/500 [====================>.........] - ETA: 45s - loss: 1.6675 - regression_loss: 1.4259 - classification_loss: 0.2416 366/500 [====================>.........] - ETA: 45s - loss: 1.6672 - regression_loss: 1.4256 - classification_loss: 0.2416 367/500 [=====================>........] - ETA: 45s - loss: 1.6668 - regression_loss: 1.4253 - classification_loss: 0.2415 368/500 [=====================>........] - ETA: 44s - loss: 1.6659 - regression_loss: 1.4246 - classification_loss: 0.2413 369/500 [=====================>........] - ETA: 44s - loss: 1.6674 - regression_loss: 1.4259 - classification_loss: 0.2415 370/500 [=====================>........] 
- ETA: 44s - loss: 1.6662 - regression_loss: 1.4249 - classification_loss: 0.2413 371/500 [=====================>........] - ETA: 43s - loss: 1.6637 - regression_loss: 1.4228 - classification_loss: 0.2409 372/500 [=====================>........] - ETA: 43s - loss: 1.6623 - regression_loss: 1.4216 - classification_loss: 0.2407 373/500 [=====================>........] - ETA: 43s - loss: 1.6624 - regression_loss: 1.4218 - classification_loss: 0.2406 374/500 [=====================>........] - ETA: 42s - loss: 1.6636 - regression_loss: 1.4227 - classification_loss: 0.2409 375/500 [=====================>........] - ETA: 42s - loss: 1.6625 - regression_loss: 1.4218 - classification_loss: 0.2407 376/500 [=====================>........] - ETA: 42s - loss: 1.6609 - regression_loss: 1.4205 - classification_loss: 0.2404 377/500 [=====================>........] - ETA: 41s - loss: 1.6629 - regression_loss: 1.4225 - classification_loss: 0.2404 378/500 [=====================>........] - ETA: 41s - loss: 1.6636 - regression_loss: 1.4230 - classification_loss: 0.2406 379/500 [=====================>........] - ETA: 41s - loss: 1.6608 - regression_loss: 1.4205 - classification_loss: 0.2403 380/500 [=====================>........] - ETA: 40s - loss: 1.6588 - regression_loss: 1.4187 - classification_loss: 0.2401 381/500 [=====================>........] - ETA: 40s - loss: 1.6597 - regression_loss: 1.4195 - classification_loss: 0.2402 382/500 [=====================>........] - ETA: 40s - loss: 1.6597 - regression_loss: 1.4196 - classification_loss: 0.2402 383/500 [=====================>........] - ETA: 39s - loss: 1.6590 - regression_loss: 1.4192 - classification_loss: 0.2399 384/500 [======================>.......] - ETA: 39s - loss: 1.6574 - regression_loss: 1.4177 - classification_loss: 0.2397 385/500 [======================>.......] - ETA: 39s - loss: 1.6590 - regression_loss: 1.4188 - classification_loss: 0.2402 386/500 [======================>.......] 
- ETA: 38s - loss: 1.6579 - regression_loss: 1.4178 - classification_loss: 0.2400 387/500 [======================>.......] - ETA: 38s - loss: 1.6588 - regression_loss: 1.4189 - classification_loss: 0.2399 388/500 [======================>.......] - ETA: 38s - loss: 1.6592 - regression_loss: 1.4191 - classification_loss: 0.2402 389/500 [======================>.......] - ETA: 37s - loss: 1.6596 - regression_loss: 1.4196 - classification_loss: 0.2400 390/500 [======================>.......] - ETA: 37s - loss: 1.6620 - regression_loss: 1.4214 - classification_loss: 0.2405 391/500 [======================>.......] - ETA: 37s - loss: 1.6603 - regression_loss: 1.4202 - classification_loss: 0.2401 392/500 [======================>.......] - ETA: 36s - loss: 1.6613 - regression_loss: 1.4211 - classification_loss: 0.2402 393/500 [======================>.......] - ETA: 36s - loss: 1.6607 - regression_loss: 1.4205 - classification_loss: 0.2401 394/500 [======================>.......] - ETA: 36s - loss: 1.6597 - regression_loss: 1.4199 - classification_loss: 0.2399 395/500 [======================>.......] - ETA: 35s - loss: 1.6602 - regression_loss: 1.4201 - classification_loss: 0.2401 396/500 [======================>.......] - ETA: 35s - loss: 1.6592 - regression_loss: 1.4193 - classification_loss: 0.2398 397/500 [======================>.......] - ETA: 35s - loss: 1.6601 - regression_loss: 1.4200 - classification_loss: 0.2401 398/500 [======================>.......] - ETA: 34s - loss: 1.6595 - regression_loss: 1.4195 - classification_loss: 0.2400 399/500 [======================>.......] - ETA: 34s - loss: 1.6585 - regression_loss: 1.4188 - classification_loss: 0.2397 400/500 [=======================>......] - ETA: 33s - loss: 1.6561 - regression_loss: 1.4166 - classification_loss: 0.2395 401/500 [=======================>......] - ETA: 33s - loss: 1.6564 - regression_loss: 1.4170 - classification_loss: 0.2394 402/500 [=======================>......] 
- ETA: 33s - loss: 1.6555 - regression_loss: 1.4164 - classification_loss: 0.2391 403/500 [=======================>......] - ETA: 32s - loss: 1.6567 - regression_loss: 1.4173 - classification_loss: 0.2394 404/500 [=======================>......] - ETA: 32s - loss: 1.6542 - regression_loss: 1.4151 - classification_loss: 0.2391 405/500 [=======================>......] - ETA: 32s - loss: 1.6542 - regression_loss: 1.4152 - classification_loss: 0.2389 406/500 [=======================>......] - ETA: 31s - loss: 1.6530 - regression_loss: 1.4142 - classification_loss: 0.2388 407/500 [=======================>......] - ETA: 31s - loss: 1.6539 - regression_loss: 1.4153 - classification_loss: 0.2386 408/500 [=======================>......] - ETA: 31s - loss: 1.6531 - regression_loss: 1.4147 - classification_loss: 0.2384 409/500 [=======================>......] - ETA: 30s - loss: 1.6535 - regression_loss: 1.4151 - classification_loss: 0.2384 410/500 [=======================>......] - ETA: 30s - loss: 1.6521 - regression_loss: 1.4138 - classification_loss: 0.2383 411/500 [=======================>......] - ETA: 30s - loss: 1.6518 - regression_loss: 1.4136 - classification_loss: 0.2382 412/500 [=======================>......] - ETA: 29s - loss: 1.6519 - regression_loss: 1.4136 - classification_loss: 0.2382 413/500 [=======================>......] - ETA: 29s - loss: 1.6498 - regression_loss: 1.4119 - classification_loss: 0.2379 414/500 [=======================>......] - ETA: 29s - loss: 1.6515 - regression_loss: 1.4131 - classification_loss: 0.2384 415/500 [=======================>......] - ETA: 28s - loss: 1.6492 - regression_loss: 1.4111 - classification_loss: 0.2381 416/500 [=======================>......] - ETA: 28s - loss: 1.6495 - regression_loss: 1.4116 - classification_loss: 0.2379 417/500 [========================>.....] - ETA: 28s - loss: 1.6491 - regression_loss: 1.4114 - classification_loss: 0.2378 418/500 [========================>.....] 
- ETA: 27s - loss: 1.6470 - regression_loss: 1.4097 - classification_loss: 0.2373 419/500 [========================>.....] - ETA: 27s - loss: 1.6464 - regression_loss: 1.4093 - classification_loss: 0.2371 420/500 [========================>.....] - ETA: 27s - loss: 1.6454 - regression_loss: 1.4083 - classification_loss: 0.2371 421/500 [========================>.....] - ETA: 26s - loss: 1.6473 - regression_loss: 1.4096 - classification_loss: 0.2377 422/500 [========================>.....] - ETA: 26s - loss: 1.6459 - regression_loss: 1.4084 - classification_loss: 0.2374 423/500 [========================>.....] - ETA: 26s - loss: 1.6461 - regression_loss: 1.4086 - classification_loss: 0.2375 424/500 [========================>.....] - ETA: 25s - loss: 1.6447 - regression_loss: 1.4074 - classification_loss: 0.2372 425/500 [========================>.....] - ETA: 25s - loss: 1.6442 - regression_loss: 1.4071 - classification_loss: 0.2371 426/500 [========================>.....] - ETA: 25s - loss: 1.6432 - regression_loss: 1.4063 - classification_loss: 0.2369 427/500 [========================>.....] - ETA: 24s - loss: 1.6432 - regression_loss: 1.4065 - classification_loss: 0.2367 428/500 [========================>.....] - ETA: 24s - loss: 1.6456 - regression_loss: 1.4080 - classification_loss: 0.2376 429/500 [========================>.....] - ETA: 24s - loss: 1.6459 - regression_loss: 1.4082 - classification_loss: 0.2376 430/500 [========================>.....] - ETA: 23s - loss: 1.6476 - regression_loss: 1.4096 - classification_loss: 0.2380 431/500 [========================>.....] - ETA: 23s - loss: 1.6455 - regression_loss: 1.4077 - classification_loss: 0.2378 432/500 [========================>.....] - ETA: 23s - loss: 1.6457 - regression_loss: 1.4079 - classification_loss: 0.2378 433/500 [========================>.....] - ETA: 22s - loss: 1.6449 - regression_loss: 1.4072 - classification_loss: 0.2377 434/500 [=========================>....] 
- ETA: 22s - loss: 1.6448 - regression_loss: 1.4072 - classification_loss: 0.2376 435/500 [=========================>....] - ETA: 22s - loss: 1.6464 - regression_loss: 1.4087 - classification_loss: 0.2377 436/500 [=========================>....] - ETA: 21s - loss: 1.6458 - regression_loss: 1.4082 - classification_loss: 0.2376 437/500 [=========================>....] - ETA: 21s - loss: 1.6468 - regression_loss: 1.4089 - classification_loss: 0.2379 438/500 [=========================>....] - ETA: 21s - loss: 1.6458 - regression_loss: 1.4079 - classification_loss: 0.2379 439/500 [=========================>....] - ETA: 20s - loss: 1.6479 - regression_loss: 1.4095 - classification_loss: 0.2384 440/500 [=========================>....] - ETA: 20s - loss: 1.6459 - regression_loss: 1.4079 - classification_loss: 0.2380 441/500 [=========================>....] - ETA: 20s - loss: 1.6456 - regression_loss: 1.4074 - classification_loss: 0.2381 442/500 [=========================>....] - ETA: 19s - loss: 1.6453 - regression_loss: 1.4073 - classification_loss: 0.2380 443/500 [=========================>....] - ETA: 19s - loss: 1.6437 - regression_loss: 1.4059 - classification_loss: 0.2378 444/500 [=========================>....] - ETA: 19s - loss: 1.6443 - regression_loss: 1.4063 - classification_loss: 0.2380 445/500 [=========================>....] - ETA: 18s - loss: 1.6455 - regression_loss: 1.4076 - classification_loss: 0.2379 446/500 [=========================>....] - ETA: 18s - loss: 1.6457 - regression_loss: 1.4078 - classification_loss: 0.2379 447/500 [=========================>....] - ETA: 18s - loss: 1.6453 - regression_loss: 1.4075 - classification_loss: 0.2377 448/500 [=========================>....] - ETA: 17s - loss: 1.6435 - regression_loss: 1.4059 - classification_loss: 0.2376 449/500 [=========================>....] - ETA: 17s - loss: 1.6419 - regression_loss: 1.4046 - classification_loss: 0.2373 450/500 [==========================>...] 
- ETA: 16s - loss: 1.6404 - regression_loss: 1.4034 - classification_loss: 0.2370 451/500 [==========================>...] - ETA: 16s - loss: 1.6402 - regression_loss: 1.4035 - classification_loss: 0.2367 452/500 [==========================>...] - ETA: 16s - loss: 1.6401 - regression_loss: 1.4035 - classification_loss: 0.2366 453/500 [==========================>...] - ETA: 15s - loss: 1.6419 - regression_loss: 1.4050 - classification_loss: 0.2370 454/500 [==========================>...] - ETA: 15s - loss: 1.6440 - regression_loss: 1.4066 - classification_loss: 0.2374 455/500 [==========================>...] - ETA: 15s - loss: 1.6482 - regression_loss: 1.4096 - classification_loss: 0.2386 456/500 [==========================>...] - ETA: 14s - loss: 1.6498 - regression_loss: 1.4108 - classification_loss: 0.2390 457/500 [==========================>...] - ETA: 14s - loss: 1.6496 - regression_loss: 1.4107 - classification_loss: 0.2389 458/500 [==========================>...] - ETA: 14s - loss: 1.6490 - regression_loss: 1.4101 - classification_loss: 0.2388 459/500 [==========================>...] - ETA: 13s - loss: 1.6464 - regression_loss: 1.4080 - classification_loss: 0.2384 460/500 [==========================>...] - ETA: 13s - loss: 1.6450 - regression_loss: 1.4068 - classification_loss: 0.2382 461/500 [==========================>...] - ETA: 13s - loss: 1.6457 - regression_loss: 1.4073 - classification_loss: 0.2384 462/500 [==========================>...] - ETA: 12s - loss: 1.6458 - regression_loss: 1.4074 - classification_loss: 0.2383 463/500 [==========================>...] - ETA: 12s - loss: 1.6461 - regression_loss: 1.4079 - classification_loss: 0.2382 464/500 [==========================>...] - ETA: 12s - loss: 1.6452 - regression_loss: 1.4069 - classification_loss: 0.2383 465/500 [==========================>...] - ETA: 11s - loss: 1.6453 - regression_loss: 1.4072 - classification_loss: 0.2381 466/500 [==========================>...] 
[per-batch progress-bar updates for the remainder of Epoch 7 omitted; loss converged to ~1.64 (regression ~1.40, classification ~0.24) over batches 467-499]
500/500 [==============================] - 170s 340ms/step - loss: 1.6373 - regression_loss: 1.3992 - classification_loss: 0.2381
326 instances of class plum with average precision: 0.8144
mAP: 0.8144
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/150
[per-batch progress-bar updates omitted; through batch 301/500 the running loss settled around 1.62 (regression ~1.39, classification ~0.23)]
- ETA: 1:07 - loss: 1.6195 - regression_loss: 1.3908 - classification_loss: 0.2287 302/500 [=================>............] - ETA: 1:07 - loss: 1.6162 - regression_loss: 1.3880 - classification_loss: 0.2281 303/500 [=================>............] - ETA: 1:06 - loss: 1.6163 - regression_loss: 1.3881 - classification_loss: 0.2281 304/500 [=================>............] - ETA: 1:06 - loss: 1.6142 - regression_loss: 1.3836 - classification_loss: 0.2306 305/500 [=================>............] - ETA: 1:06 - loss: 1.6149 - regression_loss: 1.3842 - classification_loss: 0.2307 306/500 [=================>............] - ETA: 1:05 - loss: 1.6135 - regression_loss: 1.3832 - classification_loss: 0.2304 307/500 [=================>............] - ETA: 1:05 - loss: 1.6128 - regression_loss: 1.3825 - classification_loss: 0.2303 308/500 [=================>............] - ETA: 1:05 - loss: 1.6122 - regression_loss: 1.3822 - classification_loss: 0.2301 309/500 [=================>............] - ETA: 1:04 - loss: 1.6122 - regression_loss: 1.3821 - classification_loss: 0.2300 310/500 [=================>............] - ETA: 1:04 - loss: 1.6124 - regression_loss: 1.3825 - classification_loss: 0.2299 311/500 [=================>............] - ETA: 1:04 - loss: 1.6110 - regression_loss: 1.3815 - classification_loss: 0.2294 312/500 [=================>............] - ETA: 1:03 - loss: 1.6105 - regression_loss: 1.3814 - classification_loss: 0.2291 313/500 [=================>............] - ETA: 1:03 - loss: 1.6087 - regression_loss: 1.3801 - classification_loss: 0.2287 314/500 [=================>............] - ETA: 1:03 - loss: 1.6086 - regression_loss: 1.3801 - classification_loss: 0.2285 315/500 [=================>............] - ETA: 1:02 - loss: 1.6077 - regression_loss: 1.3796 - classification_loss: 0.2282 316/500 [=================>............] - ETA: 1:02 - loss: 1.6154 - regression_loss: 1.3848 - classification_loss: 0.2306 317/500 [==================>...........] 
- ETA: 1:02 - loss: 1.6159 - regression_loss: 1.3852 - classification_loss: 0.2307 318/500 [==================>...........] - ETA: 1:01 - loss: 1.6149 - regression_loss: 1.3843 - classification_loss: 0.2306 319/500 [==================>...........] - ETA: 1:01 - loss: 1.6151 - regression_loss: 1.3846 - classification_loss: 0.2305 320/500 [==================>...........] - ETA: 1:01 - loss: 1.6144 - regression_loss: 1.3842 - classification_loss: 0.2302 321/500 [==================>...........] - ETA: 1:00 - loss: 1.6169 - regression_loss: 1.3861 - classification_loss: 0.2308 322/500 [==================>...........] - ETA: 1:00 - loss: 1.6198 - regression_loss: 1.3887 - classification_loss: 0.2311 323/500 [==================>...........] - ETA: 1:00 - loss: 1.6186 - regression_loss: 1.3878 - classification_loss: 0.2308 324/500 [==================>...........] - ETA: 59s - loss: 1.6188 - regression_loss: 1.3880 - classification_loss: 0.2308  325/500 [==================>...........] - ETA: 59s - loss: 1.6198 - regression_loss: 1.3888 - classification_loss: 0.2310 326/500 [==================>...........] - ETA: 59s - loss: 1.6173 - regression_loss: 1.3867 - classification_loss: 0.2305 327/500 [==================>...........] - ETA: 58s - loss: 1.6165 - regression_loss: 1.3862 - classification_loss: 0.2303 328/500 [==================>...........] - ETA: 58s - loss: 1.6151 - regression_loss: 1.3850 - classification_loss: 0.2301 329/500 [==================>...........] - ETA: 58s - loss: 1.6145 - regression_loss: 1.3847 - classification_loss: 0.2299 330/500 [==================>...........] - ETA: 57s - loss: 1.6143 - regression_loss: 1.3845 - classification_loss: 0.2298 331/500 [==================>...........] - ETA: 57s - loss: 1.6132 - regression_loss: 1.3837 - classification_loss: 0.2295 332/500 [==================>...........] - ETA: 57s - loss: 1.6133 - regression_loss: 1.3836 - classification_loss: 0.2297 333/500 [==================>...........] 
- ETA: 56s - loss: 1.6127 - regression_loss: 1.3832 - classification_loss: 0.2295 334/500 [===================>..........] - ETA: 56s - loss: 1.6106 - regression_loss: 1.3815 - classification_loss: 0.2291 335/500 [===================>..........] - ETA: 56s - loss: 1.6109 - regression_loss: 1.3819 - classification_loss: 0.2289 336/500 [===================>..........] - ETA: 55s - loss: 1.6091 - regression_loss: 1.3804 - classification_loss: 0.2287 337/500 [===================>..........] - ETA: 55s - loss: 1.6103 - regression_loss: 1.3815 - classification_loss: 0.2288 338/500 [===================>..........] - ETA: 55s - loss: 1.6083 - regression_loss: 1.3797 - classification_loss: 0.2286 339/500 [===================>..........] - ETA: 54s - loss: 1.6069 - regression_loss: 1.3786 - classification_loss: 0.2283 340/500 [===================>..........] - ETA: 54s - loss: 1.6057 - regression_loss: 1.3776 - classification_loss: 0.2281 341/500 [===================>..........] - ETA: 53s - loss: 1.6050 - regression_loss: 1.3772 - classification_loss: 0.2279 342/500 [===================>..........] - ETA: 53s - loss: 1.6033 - regression_loss: 1.3756 - classification_loss: 0.2277 343/500 [===================>..........] - ETA: 53s - loss: 1.6058 - regression_loss: 1.3777 - classification_loss: 0.2281 344/500 [===================>..........] - ETA: 52s - loss: 1.6060 - regression_loss: 1.3778 - classification_loss: 0.2282 345/500 [===================>..........] - ETA: 52s - loss: 1.6058 - regression_loss: 1.3778 - classification_loss: 0.2279 346/500 [===================>..........] - ETA: 52s - loss: 1.6062 - regression_loss: 1.3784 - classification_loss: 0.2278 347/500 [===================>..........] - ETA: 51s - loss: 1.6088 - regression_loss: 1.3807 - classification_loss: 0.2281 348/500 [===================>..........] - ETA: 51s - loss: 1.6076 - regression_loss: 1.3797 - classification_loss: 0.2279 349/500 [===================>..........] 
- ETA: 51s - loss: 1.6072 - regression_loss: 1.3793 - classification_loss: 0.2280 350/500 [====================>.........] - ETA: 50s - loss: 1.6083 - regression_loss: 1.3800 - classification_loss: 0.2283 351/500 [====================>.........] - ETA: 50s - loss: 1.6058 - regression_loss: 1.3780 - classification_loss: 0.2278 352/500 [====================>.........] - ETA: 50s - loss: 1.6067 - regression_loss: 1.3789 - classification_loss: 0.2279 353/500 [====================>.........] - ETA: 49s - loss: 1.6092 - regression_loss: 1.3808 - classification_loss: 0.2284 354/500 [====================>.........] - ETA: 49s - loss: 1.6089 - regression_loss: 1.3807 - classification_loss: 0.2283 355/500 [====================>.........] - ETA: 49s - loss: 1.6071 - regression_loss: 1.3791 - classification_loss: 0.2280 356/500 [====================>.........] - ETA: 48s - loss: 1.6063 - regression_loss: 1.3785 - classification_loss: 0.2277 357/500 [====================>.........] - ETA: 48s - loss: 1.6050 - regression_loss: 1.3776 - classification_loss: 0.2274 358/500 [====================>.........] - ETA: 48s - loss: 1.6068 - regression_loss: 1.3787 - classification_loss: 0.2281 359/500 [====================>.........] - ETA: 47s - loss: 1.6057 - regression_loss: 1.3779 - classification_loss: 0.2279 360/500 [====================>.........] - ETA: 47s - loss: 1.6053 - regression_loss: 1.3775 - classification_loss: 0.2278 361/500 [====================>.........] - ETA: 47s - loss: 1.6042 - regression_loss: 1.3766 - classification_loss: 0.2276 362/500 [====================>.........] - ETA: 46s - loss: 1.6032 - regression_loss: 1.3758 - classification_loss: 0.2274 363/500 [====================>.........] - ETA: 46s - loss: 1.6017 - regression_loss: 1.3746 - classification_loss: 0.2271 364/500 [====================>.........] - ETA: 46s - loss: 1.5999 - regression_loss: 1.3731 - classification_loss: 0.2267 365/500 [====================>.........] 
- ETA: 45s - loss: 1.5989 - regression_loss: 1.3724 - classification_loss: 0.2265 366/500 [====================>.........] - ETA: 45s - loss: 1.5969 - regression_loss: 1.3707 - classification_loss: 0.2263 367/500 [=====================>........] - ETA: 45s - loss: 1.5958 - regression_loss: 1.3698 - classification_loss: 0.2260 368/500 [=====================>........] - ETA: 44s - loss: 1.5961 - regression_loss: 1.3703 - classification_loss: 0.2258 369/500 [=====================>........] - ETA: 44s - loss: 1.6006 - regression_loss: 1.3734 - classification_loss: 0.2272 370/500 [=====================>........] - ETA: 44s - loss: 1.6000 - regression_loss: 1.3730 - classification_loss: 0.2271 371/500 [=====================>........] - ETA: 43s - loss: 1.5999 - regression_loss: 1.3730 - classification_loss: 0.2269 372/500 [=====================>........] - ETA: 43s - loss: 1.5996 - regression_loss: 1.3730 - classification_loss: 0.2266 373/500 [=====================>........] - ETA: 43s - loss: 1.5985 - regression_loss: 1.3722 - classification_loss: 0.2264 374/500 [=====================>........] - ETA: 42s - loss: 1.5993 - regression_loss: 1.3725 - classification_loss: 0.2269 375/500 [=====================>........] - ETA: 42s - loss: 1.5993 - regression_loss: 1.3723 - classification_loss: 0.2270 376/500 [=====================>........] - ETA: 42s - loss: 1.5991 - regression_loss: 1.3719 - classification_loss: 0.2272 377/500 [=====================>........] - ETA: 41s - loss: 1.5967 - regression_loss: 1.3699 - classification_loss: 0.2268 378/500 [=====================>........] - ETA: 41s - loss: 1.5992 - regression_loss: 1.3720 - classification_loss: 0.2272 379/500 [=====================>........] - ETA: 41s - loss: 1.5983 - regression_loss: 1.3713 - classification_loss: 0.2270 380/500 [=====================>........] - ETA: 40s - loss: 1.5989 - regression_loss: 1.3718 - classification_loss: 0.2271 381/500 [=====================>........] 
- ETA: 40s - loss: 1.5978 - regression_loss: 1.3709 - classification_loss: 0.2269 382/500 [=====================>........] - ETA: 40s - loss: 1.5966 - regression_loss: 1.3700 - classification_loss: 0.2266 383/500 [=====================>........] - ETA: 39s - loss: 1.5977 - regression_loss: 1.3707 - classification_loss: 0.2270 384/500 [======================>.......] - ETA: 39s - loss: 1.5984 - regression_loss: 1.3712 - classification_loss: 0.2272 385/500 [======================>.......] - ETA: 39s - loss: 1.5983 - regression_loss: 1.3713 - classification_loss: 0.2270 386/500 [======================>.......] - ETA: 38s - loss: 1.5963 - regression_loss: 1.3693 - classification_loss: 0.2269 387/500 [======================>.......] - ETA: 38s - loss: 1.5970 - regression_loss: 1.3703 - classification_loss: 0.2268 388/500 [======================>.......] - ETA: 38s - loss: 1.5990 - regression_loss: 1.3725 - classification_loss: 0.2265 389/500 [======================>.......] - ETA: 37s - loss: 1.5987 - regression_loss: 1.3723 - classification_loss: 0.2263 390/500 [======================>.......] - ETA: 37s - loss: 1.5974 - regression_loss: 1.3713 - classification_loss: 0.2261 391/500 [======================>.......] - ETA: 37s - loss: 1.5971 - regression_loss: 1.3712 - classification_loss: 0.2260 392/500 [======================>.......] - ETA: 36s - loss: 1.5992 - regression_loss: 1.3732 - classification_loss: 0.2260 393/500 [======================>.......] - ETA: 36s - loss: 1.5992 - regression_loss: 1.3732 - classification_loss: 0.2260 394/500 [======================>.......] - ETA: 36s - loss: 1.5976 - regression_loss: 1.3721 - classification_loss: 0.2256 395/500 [======================>.......] - ETA: 35s - loss: 1.5981 - regression_loss: 1.3722 - classification_loss: 0.2259 396/500 [======================>.......] - ETA: 35s - loss: 1.6004 - regression_loss: 1.3743 - classification_loss: 0.2260 397/500 [======================>.......] 
- ETA: 34s - loss: 1.5997 - regression_loss: 1.3737 - classification_loss: 0.2260 398/500 [======================>.......] - ETA: 34s - loss: 1.6005 - regression_loss: 1.3742 - classification_loss: 0.2262 399/500 [======================>.......] - ETA: 34s - loss: 1.5996 - regression_loss: 1.3736 - classification_loss: 0.2259 400/500 [=======================>......] - ETA: 33s - loss: 1.5972 - regression_loss: 1.3717 - classification_loss: 0.2256 401/500 [=======================>......] - ETA: 33s - loss: 1.5971 - regression_loss: 1.3715 - classification_loss: 0.2256 402/500 [=======================>......] - ETA: 33s - loss: 1.6011 - regression_loss: 1.3744 - classification_loss: 0.2267 403/500 [=======================>......] - ETA: 32s - loss: 1.5992 - regression_loss: 1.3728 - classification_loss: 0.2263 404/500 [=======================>......] - ETA: 32s - loss: 1.5991 - regression_loss: 1.3730 - classification_loss: 0.2261 405/500 [=======================>......] - ETA: 32s - loss: 1.5987 - regression_loss: 1.3729 - classification_loss: 0.2258 406/500 [=======================>......] - ETA: 31s - loss: 1.5990 - regression_loss: 1.3731 - classification_loss: 0.2259 407/500 [=======================>......] - ETA: 31s - loss: 1.6007 - regression_loss: 1.3744 - classification_loss: 0.2263 408/500 [=======================>......] - ETA: 31s - loss: 1.6005 - regression_loss: 1.3741 - classification_loss: 0.2263 409/500 [=======================>......] - ETA: 30s - loss: 1.6003 - regression_loss: 1.3740 - classification_loss: 0.2263 410/500 [=======================>......] - ETA: 30s - loss: 1.6005 - regression_loss: 1.3743 - classification_loss: 0.2263 411/500 [=======================>......] - ETA: 30s - loss: 1.6026 - regression_loss: 1.3762 - classification_loss: 0.2263 412/500 [=======================>......] - ETA: 29s - loss: 1.6016 - regression_loss: 1.3753 - classification_loss: 0.2263 413/500 [=======================>......] 
- ETA: 29s - loss: 1.6016 - regression_loss: 1.3755 - classification_loss: 0.2262 414/500 [=======================>......] - ETA: 29s - loss: 1.6001 - regression_loss: 1.3743 - classification_loss: 0.2259 415/500 [=======================>......] - ETA: 28s - loss: 1.5998 - regression_loss: 1.3739 - classification_loss: 0.2259 416/500 [=======================>......] - ETA: 28s - loss: 1.6020 - regression_loss: 1.3744 - classification_loss: 0.2276 417/500 [========================>.....] - ETA: 28s - loss: 1.6013 - regression_loss: 1.3740 - classification_loss: 0.2274 418/500 [========================>.....] - ETA: 27s - loss: 1.6020 - regression_loss: 1.3744 - classification_loss: 0.2276 419/500 [========================>.....] - ETA: 27s - loss: 1.6010 - regression_loss: 1.3736 - classification_loss: 0.2273 420/500 [========================>.....] - ETA: 27s - loss: 1.5994 - regression_loss: 1.3723 - classification_loss: 0.2271 421/500 [========================>.....] - ETA: 26s - loss: 1.5976 - regression_loss: 1.3708 - classification_loss: 0.2268 422/500 [========================>.....] - ETA: 26s - loss: 1.5965 - regression_loss: 1.3699 - classification_loss: 0.2266 423/500 [========================>.....] - ETA: 26s - loss: 1.5958 - regression_loss: 1.3694 - classification_loss: 0.2264 424/500 [========================>.....] - ETA: 25s - loss: 1.5937 - regression_loss: 1.3676 - classification_loss: 0.2261 425/500 [========================>.....] - ETA: 25s - loss: 1.5943 - regression_loss: 1.3681 - classification_loss: 0.2262 426/500 [========================>.....] - ETA: 25s - loss: 1.5934 - regression_loss: 1.3674 - classification_loss: 0.2260 427/500 [========================>.....] - ETA: 24s - loss: 1.5926 - regression_loss: 1.3668 - classification_loss: 0.2258 428/500 [========================>.....] - ETA: 24s - loss: 1.5918 - regression_loss: 1.3661 - classification_loss: 0.2257 429/500 [========================>.....] 
- ETA: 24s - loss: 1.5912 - regression_loss: 1.3655 - classification_loss: 0.2257 430/500 [========================>.....] - ETA: 23s - loss: 1.5915 - regression_loss: 1.3659 - classification_loss: 0.2256 431/500 [========================>.....] - ETA: 23s - loss: 1.5904 - regression_loss: 1.3651 - classification_loss: 0.2253 432/500 [========================>.....] - ETA: 23s - loss: 1.5904 - regression_loss: 1.3651 - classification_loss: 0.2253 433/500 [========================>.....] - ETA: 22s - loss: 1.5898 - regression_loss: 1.3647 - classification_loss: 0.2250 434/500 [=========================>....] - ETA: 22s - loss: 1.5901 - regression_loss: 1.3651 - classification_loss: 0.2250 435/500 [=========================>....] - ETA: 22s - loss: 1.5890 - regression_loss: 1.3642 - classification_loss: 0.2248 436/500 [=========================>....] - ETA: 21s - loss: 1.5877 - regression_loss: 1.3632 - classification_loss: 0.2245 437/500 [=========================>....] - ETA: 21s - loss: 1.5888 - regression_loss: 1.3642 - classification_loss: 0.2246 438/500 [=========================>....] - ETA: 21s - loss: 1.5890 - regression_loss: 1.3644 - classification_loss: 0.2247 439/500 [=========================>....] - ETA: 20s - loss: 1.5870 - regression_loss: 1.3627 - classification_loss: 0.2243 440/500 [=========================>....] - ETA: 20s - loss: 1.5877 - regression_loss: 1.3633 - classification_loss: 0.2245 441/500 [=========================>....] - ETA: 20s - loss: 1.5855 - regression_loss: 1.3614 - classification_loss: 0.2241 442/500 [=========================>....] - ETA: 19s - loss: 1.5834 - regression_loss: 1.3597 - classification_loss: 0.2237 443/500 [=========================>....] - ETA: 19s - loss: 1.5840 - regression_loss: 1.3602 - classification_loss: 0.2238 444/500 [=========================>....] - ETA: 19s - loss: 1.5857 - regression_loss: 1.3614 - classification_loss: 0.2243 445/500 [=========================>....] 
- ETA: 18s - loss: 1.5856 - regression_loss: 1.3613 - classification_loss: 0.2243 446/500 [=========================>....] - ETA: 18s - loss: 1.5875 - regression_loss: 1.3625 - classification_loss: 0.2250 447/500 [=========================>....] - ETA: 18s - loss: 1.5885 - regression_loss: 1.3633 - classification_loss: 0.2252 448/500 [=========================>....] - ETA: 17s - loss: 1.5874 - regression_loss: 1.3624 - classification_loss: 0.2251 449/500 [=========================>....] - ETA: 17s - loss: 1.5866 - regression_loss: 1.3617 - classification_loss: 0.2249 450/500 [==========================>...] - ETA: 16s - loss: 1.5865 - regression_loss: 1.3616 - classification_loss: 0.2248 451/500 [==========================>...] - ETA: 16s - loss: 1.5886 - regression_loss: 1.3632 - classification_loss: 0.2254 452/500 [==========================>...] - ETA: 16s - loss: 1.5867 - regression_loss: 1.3614 - classification_loss: 0.2252 453/500 [==========================>...] - ETA: 15s - loss: 1.5850 - regression_loss: 1.3600 - classification_loss: 0.2250 454/500 [==========================>...] - ETA: 15s - loss: 1.5858 - regression_loss: 1.3609 - classification_loss: 0.2250 455/500 [==========================>...] - ETA: 15s - loss: 1.5864 - regression_loss: 1.3614 - classification_loss: 0.2250 456/500 [==========================>...] - ETA: 14s - loss: 1.5845 - regression_loss: 1.3598 - classification_loss: 0.2247 457/500 [==========================>...] - ETA: 14s - loss: 1.5840 - regression_loss: 1.3595 - classification_loss: 0.2245 458/500 [==========================>...] - ETA: 14s - loss: 1.5832 - regression_loss: 1.3588 - classification_loss: 0.2245 459/500 [==========================>...] - ETA: 13s - loss: 1.5828 - regression_loss: 1.3584 - classification_loss: 0.2244 460/500 [==========================>...] - ETA: 13s - loss: 1.5804 - regression_loss: 1.3564 - classification_loss: 0.2240 461/500 [==========================>...] 
- ETA: 13s - loss: 1.5808 - regression_loss: 1.3567 - classification_loss: 0.2241 462/500 [==========================>...] - ETA: 12s - loss: 1.5812 - regression_loss: 1.3571 - classification_loss: 0.2241 463/500 [==========================>...] - ETA: 12s - loss: 1.5815 - regression_loss: 1.3572 - classification_loss: 0.2243 464/500 [==========================>...] - ETA: 12s - loss: 1.5813 - regression_loss: 1.3572 - classification_loss: 0.2241 465/500 [==========================>...] - ETA: 11s - loss: 1.5814 - regression_loss: 1.3573 - classification_loss: 0.2241 466/500 [==========================>...] - ETA: 11s - loss: 1.5807 - regression_loss: 1.3568 - classification_loss: 0.2239 467/500 [===========================>..] - ETA: 11s - loss: 1.5822 - regression_loss: 1.3581 - classification_loss: 0.2241 468/500 [===========================>..] - ETA: 10s - loss: 1.5847 - regression_loss: 1.3601 - classification_loss: 0.2245 469/500 [===========================>..] - ETA: 10s - loss: 1.5826 - regression_loss: 1.3584 - classification_loss: 0.2242 470/500 [===========================>..] - ETA: 10s - loss: 1.5828 - regression_loss: 1.3586 - classification_loss: 0.2242 471/500 [===========================>..] - ETA: 9s - loss: 1.5829 - regression_loss: 1.3588 - classification_loss: 0.2241  472/500 [===========================>..] - ETA: 9s - loss: 1.5835 - regression_loss: 1.3591 - classification_loss: 0.2244 473/500 [===========================>..] - ETA: 9s - loss: 1.5833 - regression_loss: 1.3590 - classification_loss: 0.2243 474/500 [===========================>..] - ETA: 8s - loss: 1.5839 - regression_loss: 1.3595 - classification_loss: 0.2244 475/500 [===========================>..] - ETA: 8s - loss: 1.5838 - regression_loss: 1.3595 - classification_loss: 0.2243 476/500 [===========================>..] - ETA: 8s - loss: 1.5849 - regression_loss: 1.3602 - classification_loss: 0.2246 477/500 [===========================>..] 
- ETA: 7s - loss: 1.5835 - regression_loss: 1.3591 - classification_loss: 0.2244 478/500 [===========================>..] - ETA: 7s - loss: 1.5847 - regression_loss: 1.3603 - classification_loss: 0.2244 479/500 [===========================>..] - ETA: 7s - loss: 1.5841 - regression_loss: 1.3599 - classification_loss: 0.2242 480/500 [===========================>..] - ETA: 6s - loss: 1.5837 - regression_loss: 1.3595 - classification_loss: 0.2242 481/500 [===========================>..] - ETA: 6s - loss: 1.5835 - regression_loss: 1.3593 - classification_loss: 0.2242 482/500 [===========================>..] - ETA: 6s - loss: 1.5836 - regression_loss: 1.3594 - classification_loss: 0.2242 483/500 [===========================>..] - ETA: 5s - loss: 1.5829 - regression_loss: 1.3590 - classification_loss: 0.2239 484/500 [============================>.] - ETA: 5s - loss: 1.5827 - regression_loss: 1.3588 - classification_loss: 0.2239 485/500 [============================>.] - ETA: 5s - loss: 1.5824 - regression_loss: 1.3583 - classification_loss: 0.2240 486/500 [============================>.] - ETA: 4s - loss: 1.5823 - regression_loss: 1.3584 - classification_loss: 0.2240 487/500 [============================>.] - ETA: 4s - loss: 1.5821 - regression_loss: 1.3583 - classification_loss: 0.2238 488/500 [============================>.] - ETA: 4s - loss: 1.5835 - regression_loss: 1.3595 - classification_loss: 0.2240 489/500 [============================>.] - ETA: 3s - loss: 1.5820 - regression_loss: 1.3583 - classification_loss: 0.2237 490/500 [============================>.] - ETA: 3s - loss: 1.5818 - regression_loss: 1.3581 - classification_loss: 0.2236 491/500 [============================>.] - ETA: 3s - loss: 1.5830 - regression_loss: 1.3591 - classification_loss: 0.2239 492/500 [============================>.] - ETA: 2s - loss: 1.5831 - regression_loss: 1.3593 - classification_loss: 0.2238 493/500 [============================>.] 
- ETA: 2s - loss: 1.5823 - regression_loss: 1.3587 - classification_loss: 0.2236 494/500 [============================>.] - ETA: 2s - loss: 1.5834 - regression_loss: 1.3597 - classification_loss: 0.2237 495/500 [============================>.] - ETA: 1s - loss: 1.5832 - regression_loss: 1.3596 - classification_loss: 0.2236 496/500 [============================>.] - ETA: 1s - loss: 1.5842 - regression_loss: 1.3604 - classification_loss: 0.2238 497/500 [============================>.] - ETA: 1s - loss: 1.5835 - regression_loss: 1.3599 - classification_loss: 0.2236 498/500 [============================>.] - ETA: 0s - loss: 1.5832 - regression_loss: 1.3597 - classification_loss: 0.2235 499/500 [============================>.] - ETA: 0s - loss: 1.5832 - regression_loss: 1.3597 - classification_loss: 0.2235 500/500 [==============================] - 170s 340ms/step - loss: 1.5841 - regression_loss: 1.3604 - classification_loss: 0.2237 326 instances of class plum with average precision: 0.8137 mAP: 0.8137 Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5 Epoch 9/150 1/500 [..............................] - ETA: 2:45 - loss: 2.5774 - regression_loss: 2.0970 - classification_loss: 0.4804 2/500 [..............................] - ETA: 2:49 - loss: 2.1592 - regression_loss: 1.7996 - classification_loss: 0.3596 3/500 [..............................] - ETA: 2:52 - loss: 1.9151 - regression_loss: 1.6267 - classification_loss: 0.2885 4/500 [..............................] - ETA: 2:50 - loss: 1.8958 - regression_loss: 1.6106 - classification_loss: 0.2852 5/500 [..............................] - ETA: 2:49 - loss: 1.7279 - regression_loss: 1.4769 - classification_loss: 0.2510 6/500 [..............................] - ETA: 2:47 - loss: 1.6094 - regression_loss: 1.3310 - classification_loss: 0.2784 7/500 [..............................] - ETA: 2:47 - loss: 1.6016 - regression_loss: 1.3111 - classification_loss: 0.2904 8/500 [..............................] 
- ETA: 2:48 - loss: 1.6506 - regression_loss: 1.3722 - classification_loss: 0.2784
 10/500 [..............................] - ETA: 2:46 - loss: 1.5616 - regression_loss: 1.3092 - classification_loss: 0.2524
 25/500 [>.............................] - ETA: 2:42 - loss: 1.7781 - regression_loss: 1.4858 - classification_loss: 0.2923
 50/500 [==>...........................] - ETA: 2:33 - loss: 1.6339 - regression_loss: 1.3813 - classification_loss: 0.2526
 75/500 [===>..........................] - ETA: 2:24 - loss: 1.5775 - regression_loss: 1.3405 - classification_loss: 0.2370
100/500 [=====>........................] - ETA: 2:15 - loss: 1.5201 - regression_loss: 1.2978 - classification_loss: 0.2223
125/500 [======>.......................] - ETA: 2:07 - loss: 1.5180 - regression_loss: 1.2966 - classification_loss: 0.2214
150/500 [========>.....................] - ETA: 1:58 - loss: 1.5412 - regression_loss: 1.3206 - classification_loss: 0.2206
175/500 [=========>....................] - ETA: 1:50 - loss: 1.5280 - regression_loss: 1.3088 - classification_loss: 0.2191
200/500 [===========>..................] - ETA: 1:41 - loss: 1.5209 - regression_loss: 1.3085 - classification_loss: 0.2124
225/500 [============>.................] - ETA: 1:33 - loss: 1.5357 - regression_loss: 1.3201 - classification_loss: 0.2155
250/500 [==============>...............] - ETA: 1:24 - loss: 1.5278 - regression_loss: 1.3151 - classification_loss: 0.2127
275/500 [===============>..............] - ETA: 1:16 - loss: 1.5248 - regression_loss: 1.3132 - classification_loss: 0.2116
300/500 [=================>............] - ETA: 1:07 - loss: 1.5182 - regression_loss: 1.3076 - classification_loss: 0.2106
325/500 [==================>...........] - ETA: 59s - loss: 1.5116 - regression_loss: 1.3020 - classification_loss: 0.2096
343/500 [===================>..........] - ETA: 53s - loss: 1.5065 - regression_loss: 1.2980 - classification_loss: 0.2085
344/500 [===================>..........]
- ETA: 58s - loss: 1.5079 - regression_loss: 1.2989 - classification_loss: 0.2089 329/500 [==================>...........] - ETA: 58s - loss: 1.5063 - regression_loss: 1.2977 - classification_loss: 0.2086 330/500 [==================>...........] - ETA: 57s - loss: 1.5055 - regression_loss: 1.2971 - classification_loss: 0.2084 331/500 [==================>...........] - ETA: 57s - loss: 1.5048 - regression_loss: 1.2967 - classification_loss: 0.2081 332/500 [==================>...........] - ETA: 56s - loss: 1.5073 - regression_loss: 1.2989 - classification_loss: 0.2084 333/500 [==================>...........] - ETA: 56s - loss: 1.5067 - regression_loss: 1.2982 - classification_loss: 0.2085 334/500 [===================>..........] - ETA: 56s - loss: 1.5078 - regression_loss: 1.2991 - classification_loss: 0.2088 335/500 [===================>..........] - ETA: 55s - loss: 1.5075 - regression_loss: 1.2988 - classification_loss: 0.2087 336/500 [===================>..........] - ETA: 55s - loss: 1.5076 - regression_loss: 1.2990 - classification_loss: 0.2086 337/500 [===================>..........] - ETA: 55s - loss: 1.5063 - regression_loss: 1.2979 - classification_loss: 0.2084 338/500 [===================>..........] - ETA: 54s - loss: 1.5071 - regression_loss: 1.2984 - classification_loss: 0.2087 339/500 [===================>..........] - ETA: 54s - loss: 1.5059 - regression_loss: 1.2974 - classification_loss: 0.2085 340/500 [===================>..........] - ETA: 54s - loss: 1.5044 - regression_loss: 1.2963 - classification_loss: 0.2081 341/500 [===================>..........] - ETA: 53s - loss: 1.5059 - regression_loss: 1.2974 - classification_loss: 0.2085 342/500 [===================>..........] - ETA: 53s - loss: 1.5065 - regression_loss: 1.2979 - classification_loss: 0.2086 343/500 [===================>..........] - ETA: 53s - loss: 1.5065 - regression_loss: 1.2980 - classification_loss: 0.2085 344/500 [===================>..........] 
- ETA: 52s - loss: 1.5042 - regression_loss: 1.2960 - classification_loss: 0.2082 345/500 [===================>..........] - ETA: 52s - loss: 1.5035 - regression_loss: 1.2955 - classification_loss: 0.2079 346/500 [===================>..........] - ETA: 52s - loss: 1.5005 - regression_loss: 1.2930 - classification_loss: 0.2075 347/500 [===================>..........] - ETA: 51s - loss: 1.5046 - regression_loss: 1.2969 - classification_loss: 0.2077 348/500 [===================>..........] - ETA: 51s - loss: 1.5032 - regression_loss: 1.2957 - classification_loss: 0.2075 349/500 [===================>..........] - ETA: 51s - loss: 1.5013 - regression_loss: 1.2941 - classification_loss: 0.2072 350/500 [====================>.........] - ETA: 50s - loss: 1.5015 - regression_loss: 1.2943 - classification_loss: 0.2072 351/500 [====================>.........] - ETA: 50s - loss: 1.5006 - regression_loss: 1.2935 - classification_loss: 0.2071 352/500 [====================>.........] - ETA: 50s - loss: 1.5046 - regression_loss: 1.2966 - classification_loss: 0.2080 353/500 [====================>.........] - ETA: 49s - loss: 1.5024 - regression_loss: 1.2947 - classification_loss: 0.2077 354/500 [====================>.........] - ETA: 49s - loss: 1.5024 - regression_loss: 1.2949 - classification_loss: 0.2075 355/500 [====================>.........] - ETA: 49s - loss: 1.5044 - regression_loss: 1.2964 - classification_loss: 0.2080 356/500 [====================>.........] - ETA: 48s - loss: 1.5034 - regression_loss: 1.2956 - classification_loss: 0.2078 357/500 [====================>.........] - ETA: 48s - loss: 1.5018 - regression_loss: 1.2942 - classification_loss: 0.2076 358/500 [====================>.........] - ETA: 48s - loss: 1.5033 - regression_loss: 1.2955 - classification_loss: 0.2078 359/500 [====================>.........] - ETA: 47s - loss: 1.5020 - regression_loss: 1.2944 - classification_loss: 0.2076 360/500 [====================>.........] 
- ETA: 47s - loss: 1.5035 - regression_loss: 1.2957 - classification_loss: 0.2078 361/500 [====================>.........] - ETA: 47s - loss: 1.5056 - regression_loss: 1.2974 - classification_loss: 0.2082 362/500 [====================>.........] - ETA: 46s - loss: 1.5068 - regression_loss: 1.2984 - classification_loss: 0.2084 363/500 [====================>.........] - ETA: 46s - loss: 1.5076 - regression_loss: 1.2993 - classification_loss: 0.2083 364/500 [====================>.........] - ETA: 46s - loss: 1.5058 - regression_loss: 1.2979 - classification_loss: 0.2079 365/500 [====================>.........] - ETA: 45s - loss: 1.5042 - regression_loss: 1.2966 - classification_loss: 0.2077 366/500 [====================>.........] - ETA: 45s - loss: 1.5040 - regression_loss: 1.2961 - classification_loss: 0.2079 367/500 [=====================>........] - ETA: 45s - loss: 1.5015 - regression_loss: 1.2941 - classification_loss: 0.2074 368/500 [=====================>........] - ETA: 44s - loss: 1.4990 - regression_loss: 1.2919 - classification_loss: 0.2071 369/500 [=====================>........] - ETA: 44s - loss: 1.4992 - regression_loss: 1.2921 - classification_loss: 0.2070 370/500 [=====================>........] - ETA: 44s - loss: 1.5004 - regression_loss: 1.2930 - classification_loss: 0.2073 371/500 [=====================>........] - ETA: 43s - loss: 1.5008 - regression_loss: 1.2935 - classification_loss: 0.2073 372/500 [=====================>........] - ETA: 43s - loss: 1.5029 - regression_loss: 1.2956 - classification_loss: 0.2073 373/500 [=====================>........] - ETA: 43s - loss: 1.5035 - regression_loss: 1.2962 - classification_loss: 0.2073 374/500 [=====================>........] - ETA: 42s - loss: 1.5037 - regression_loss: 1.2965 - classification_loss: 0.2073 375/500 [=====================>........] - ETA: 42s - loss: 1.5035 - regression_loss: 1.2964 - classification_loss: 0.2071 376/500 [=====================>........] 
- ETA: 42s - loss: 1.5043 - regression_loss: 1.2971 - classification_loss: 0.2072 377/500 [=====================>........] - ETA: 41s - loss: 1.5035 - regression_loss: 1.2965 - classification_loss: 0.2070 378/500 [=====================>........] - ETA: 41s - loss: 1.5055 - regression_loss: 1.2982 - classification_loss: 0.2073 379/500 [=====================>........] - ETA: 41s - loss: 1.5065 - regression_loss: 1.2991 - classification_loss: 0.2074 380/500 [=====================>........] - ETA: 40s - loss: 1.5058 - regression_loss: 1.2985 - classification_loss: 0.2073 381/500 [=====================>........] - ETA: 40s - loss: 1.5057 - regression_loss: 1.2981 - classification_loss: 0.2076 382/500 [=====================>........] - ETA: 40s - loss: 1.5033 - regression_loss: 1.2960 - classification_loss: 0.2073 383/500 [=====================>........] - ETA: 39s - loss: 1.5050 - regression_loss: 1.2972 - classification_loss: 0.2079 384/500 [======================>.......] - ETA: 39s - loss: 1.5045 - regression_loss: 1.2967 - classification_loss: 0.2078 385/500 [======================>.......] - ETA: 39s - loss: 1.5048 - regression_loss: 1.2970 - classification_loss: 0.2078 386/500 [======================>.......] - ETA: 38s - loss: 1.5053 - regression_loss: 1.2973 - classification_loss: 0.2080 387/500 [======================>.......] - ETA: 38s - loss: 1.5056 - regression_loss: 1.2976 - classification_loss: 0.2080 388/500 [======================>.......] - ETA: 38s - loss: 1.5053 - regression_loss: 1.2974 - classification_loss: 0.2079 389/500 [======================>.......] - ETA: 37s - loss: 1.5052 - regression_loss: 1.2975 - classification_loss: 0.2078 390/500 [======================>.......] - ETA: 37s - loss: 1.5057 - regression_loss: 1.2981 - classification_loss: 0.2076 391/500 [======================>.......] - ETA: 36s - loss: 1.5061 - regression_loss: 1.2986 - classification_loss: 0.2075 392/500 [======================>.......] 
- ETA: 36s - loss: 1.5102 - regression_loss: 1.3020 - classification_loss: 0.2082 393/500 [======================>.......] - ETA: 36s - loss: 1.5103 - regression_loss: 1.3021 - classification_loss: 0.2082 394/500 [======================>.......] - ETA: 35s - loss: 1.5102 - regression_loss: 1.3019 - classification_loss: 0.2083 395/500 [======================>.......] - ETA: 35s - loss: 1.5100 - regression_loss: 1.3018 - classification_loss: 0.2082 396/500 [======================>.......] - ETA: 35s - loss: 1.5100 - regression_loss: 1.3017 - classification_loss: 0.2082 397/500 [======================>.......] - ETA: 34s - loss: 1.5088 - regression_loss: 1.3008 - classification_loss: 0.2080 398/500 [======================>.......] - ETA: 34s - loss: 1.5156 - regression_loss: 1.3059 - classification_loss: 0.2097 399/500 [======================>.......] - ETA: 34s - loss: 1.5160 - regression_loss: 1.3064 - classification_loss: 0.2096 400/500 [=======================>......] - ETA: 33s - loss: 1.5170 - regression_loss: 1.3072 - classification_loss: 0.2097 401/500 [=======================>......] - ETA: 33s - loss: 1.5165 - regression_loss: 1.3069 - classification_loss: 0.2096 402/500 [=======================>......] - ETA: 33s - loss: 1.5161 - regression_loss: 1.3064 - classification_loss: 0.2097 403/500 [=======================>......] - ETA: 32s - loss: 1.5167 - regression_loss: 1.3070 - classification_loss: 0.2097 404/500 [=======================>......] - ETA: 32s - loss: 1.5174 - regression_loss: 1.3076 - classification_loss: 0.2098 405/500 [=======================>......] - ETA: 32s - loss: 1.5173 - regression_loss: 1.3077 - classification_loss: 0.2096 406/500 [=======================>......] - ETA: 31s - loss: 1.5154 - regression_loss: 1.3061 - classification_loss: 0.2093 407/500 [=======================>......] - ETA: 31s - loss: 1.5148 - regression_loss: 1.3057 - classification_loss: 0.2091 408/500 [=======================>......] 
- ETA: 31s - loss: 1.5177 - regression_loss: 1.3081 - classification_loss: 0.2097 409/500 [=======================>......] - ETA: 30s - loss: 1.5178 - regression_loss: 1.3082 - classification_loss: 0.2096 410/500 [=======================>......] - ETA: 30s - loss: 1.5189 - regression_loss: 1.3096 - classification_loss: 0.2094 411/500 [=======================>......] - ETA: 30s - loss: 1.5176 - regression_loss: 1.3084 - classification_loss: 0.2092 412/500 [=======================>......] - ETA: 29s - loss: 1.5159 - regression_loss: 1.3071 - classification_loss: 0.2088 413/500 [=======================>......] - ETA: 29s - loss: 1.5180 - regression_loss: 1.3089 - classification_loss: 0.2090 414/500 [=======================>......] - ETA: 29s - loss: 1.5158 - regression_loss: 1.3072 - classification_loss: 0.2087 415/500 [=======================>......] - ETA: 28s - loss: 1.5144 - regression_loss: 1.3060 - classification_loss: 0.2084 416/500 [=======================>......] - ETA: 28s - loss: 1.5139 - regression_loss: 1.3054 - classification_loss: 0.2085 417/500 [========================>.....] - ETA: 28s - loss: 1.5139 - regression_loss: 1.3055 - classification_loss: 0.2084 418/500 [========================>.....] - ETA: 27s - loss: 1.5152 - regression_loss: 1.3064 - classification_loss: 0.2088 419/500 [========================>.....] - ETA: 27s - loss: 1.5143 - regression_loss: 1.3057 - classification_loss: 0.2085 420/500 [========================>.....] - ETA: 27s - loss: 1.5166 - regression_loss: 1.3078 - classification_loss: 0.2087 421/500 [========================>.....] - ETA: 26s - loss: 1.5176 - regression_loss: 1.3087 - classification_loss: 0.2089 422/500 [========================>.....] - ETA: 26s - loss: 1.5178 - regression_loss: 1.3089 - classification_loss: 0.2089 423/500 [========================>.....] - ETA: 26s - loss: 1.5185 - regression_loss: 1.3095 - classification_loss: 0.2090 424/500 [========================>.....] 
- ETA: 25s - loss: 1.5201 - regression_loss: 1.3104 - classification_loss: 0.2097 425/500 [========================>.....] - ETA: 25s - loss: 1.5205 - regression_loss: 1.3107 - classification_loss: 0.2099 426/500 [========================>.....] - ETA: 25s - loss: 1.5202 - regression_loss: 1.3104 - classification_loss: 0.2098 427/500 [========================>.....] - ETA: 24s - loss: 1.5223 - regression_loss: 1.3112 - classification_loss: 0.2111 428/500 [========================>.....] - ETA: 24s - loss: 1.5225 - regression_loss: 1.3113 - classification_loss: 0.2112 429/500 [========================>.....] - ETA: 24s - loss: 1.5272 - regression_loss: 1.3148 - classification_loss: 0.2124 430/500 [========================>.....] - ETA: 23s - loss: 1.5259 - regression_loss: 1.3139 - classification_loss: 0.2120 431/500 [========================>.....] - ETA: 23s - loss: 1.5251 - regression_loss: 1.3132 - classification_loss: 0.2119 432/500 [========================>.....] - ETA: 23s - loss: 1.5235 - regression_loss: 1.3119 - classification_loss: 0.2117 433/500 [========================>.....] - ETA: 22s - loss: 1.5235 - regression_loss: 1.3119 - classification_loss: 0.2117 434/500 [=========================>....] - ETA: 22s - loss: 1.5226 - regression_loss: 1.3111 - classification_loss: 0.2115 435/500 [=========================>....] - ETA: 22s - loss: 1.5216 - regression_loss: 1.3102 - classification_loss: 0.2113 436/500 [=========================>....] - ETA: 21s - loss: 1.5217 - regression_loss: 1.3102 - classification_loss: 0.2115 437/500 [=========================>....] - ETA: 21s - loss: 1.5218 - regression_loss: 1.3102 - classification_loss: 0.2116 438/500 [=========================>....] - ETA: 21s - loss: 1.5217 - regression_loss: 1.3099 - classification_loss: 0.2118 439/500 [=========================>....] - ETA: 20s - loss: 1.5205 - regression_loss: 1.3089 - classification_loss: 0.2116 440/500 [=========================>....] 
- ETA: 20s - loss: 1.5203 - regression_loss: 1.3087 - classification_loss: 0.2116 441/500 [=========================>....] - ETA: 20s - loss: 1.5198 - regression_loss: 1.3085 - classification_loss: 0.2114 442/500 [=========================>....] - ETA: 19s - loss: 1.5182 - regression_loss: 1.3073 - classification_loss: 0.2110 443/500 [=========================>....] - ETA: 19s - loss: 1.5169 - regression_loss: 1.3062 - classification_loss: 0.2107 444/500 [=========================>....] - ETA: 18s - loss: 1.5184 - regression_loss: 1.3073 - classification_loss: 0.2111 445/500 [=========================>....] - ETA: 18s - loss: 1.5189 - regression_loss: 1.3076 - classification_loss: 0.2113 446/500 [=========================>....] - ETA: 18s - loss: 1.5203 - regression_loss: 1.3086 - classification_loss: 0.2117 447/500 [=========================>....] - ETA: 17s - loss: 1.5199 - regression_loss: 1.3085 - classification_loss: 0.2114 448/500 [=========================>....] - ETA: 17s - loss: 1.5192 - regression_loss: 1.3080 - classification_loss: 0.2112 449/500 [=========================>....] - ETA: 17s - loss: 1.5192 - regression_loss: 1.3082 - classification_loss: 0.2111 450/500 [==========================>...] - ETA: 16s - loss: 1.5236 - regression_loss: 1.3117 - classification_loss: 0.2119 451/500 [==========================>...] - ETA: 16s - loss: 1.5260 - regression_loss: 1.3133 - classification_loss: 0.2126 452/500 [==========================>...] - ETA: 16s - loss: 1.5244 - regression_loss: 1.3120 - classification_loss: 0.2124 453/500 [==========================>...] - ETA: 15s - loss: 1.5235 - regression_loss: 1.3112 - classification_loss: 0.2123 454/500 [==========================>...] - ETA: 15s - loss: 1.5231 - regression_loss: 1.3111 - classification_loss: 0.2121 455/500 [==========================>...] - ETA: 15s - loss: 1.5244 - regression_loss: 1.3122 - classification_loss: 0.2122 456/500 [==========================>...] 
- ETA: 14s - loss: 1.5236 - regression_loss: 1.3115 - classification_loss: 0.2121 457/500 [==========================>...] - ETA: 14s - loss: 1.5239 - regression_loss: 1.3118 - classification_loss: 0.2121 458/500 [==========================>...] - ETA: 14s - loss: 1.5241 - regression_loss: 1.3120 - classification_loss: 0.2121 459/500 [==========================>...] - ETA: 13s - loss: 1.5240 - regression_loss: 1.3120 - classification_loss: 0.2120 460/500 [==========================>...] - ETA: 13s - loss: 1.5245 - regression_loss: 1.3125 - classification_loss: 0.2120 461/500 [==========================>...] - ETA: 13s - loss: 1.5230 - regression_loss: 1.3114 - classification_loss: 0.2117 462/500 [==========================>...] - ETA: 12s - loss: 1.5226 - regression_loss: 1.3110 - classification_loss: 0.2116 463/500 [==========================>...] - ETA: 12s - loss: 1.5224 - regression_loss: 1.3109 - classification_loss: 0.2115 464/500 [==========================>...] - ETA: 12s - loss: 1.5221 - regression_loss: 1.3106 - classification_loss: 0.2115 465/500 [==========================>...] - ETA: 11s - loss: 1.5210 - regression_loss: 1.3096 - classification_loss: 0.2113 466/500 [==========================>...] - ETA: 11s - loss: 1.5202 - regression_loss: 1.3091 - classification_loss: 0.2112 467/500 [===========================>..] - ETA: 11s - loss: 1.5203 - regression_loss: 1.3090 - classification_loss: 0.2113 468/500 [===========================>..] - ETA: 10s - loss: 1.5203 - regression_loss: 1.3091 - classification_loss: 0.2112 469/500 [===========================>..] - ETA: 10s - loss: 1.5208 - regression_loss: 1.3097 - classification_loss: 0.2111 470/500 [===========================>..] - ETA: 10s - loss: 1.5199 - regression_loss: 1.3090 - classification_loss: 0.2109 471/500 [===========================>..] - ETA: 9s - loss: 1.5253 - regression_loss: 1.3123 - classification_loss: 0.2130  472/500 [===========================>..] 
- ETA: 9s - loss: 1.5253 - regression_loss: 1.3124 - classification_loss: 0.2129 473/500 [===========================>..] - ETA: 9s - loss: 1.5253 - regression_loss: 1.3126 - classification_loss: 0.2127 474/500 [===========================>..] - ETA: 8s - loss: 1.5261 - regression_loss: 1.3130 - classification_loss: 0.2131 475/500 [===========================>..] - ETA: 8s - loss: 1.5261 - regression_loss: 1.3132 - classification_loss: 0.2130 476/500 [===========================>..] - ETA: 8s - loss: 1.5273 - regression_loss: 1.3140 - classification_loss: 0.2133 477/500 [===========================>..] - ETA: 7s - loss: 1.5297 - regression_loss: 1.3159 - classification_loss: 0.2138 478/500 [===========================>..] - ETA: 7s - loss: 1.5296 - regression_loss: 1.3157 - classification_loss: 0.2139 479/500 [===========================>..] - ETA: 7s - loss: 1.5282 - regression_loss: 1.3145 - classification_loss: 0.2137 480/500 [===========================>..] - ETA: 6s - loss: 1.5298 - regression_loss: 1.3157 - classification_loss: 0.2141 481/500 [===========================>..] - ETA: 6s - loss: 1.5312 - regression_loss: 1.3168 - classification_loss: 0.2144 482/500 [===========================>..] - ETA: 6s - loss: 1.5308 - regression_loss: 1.3165 - classification_loss: 0.2143 483/500 [===========================>..] - ETA: 5s - loss: 1.5290 - regression_loss: 1.3149 - classification_loss: 0.2141 484/500 [============================>.] - ETA: 5s - loss: 1.5285 - regression_loss: 1.3146 - classification_loss: 0.2139 485/500 [============================>.] - ETA: 5s - loss: 1.5289 - regression_loss: 1.3149 - classification_loss: 0.2141 486/500 [============================>.] - ETA: 4s - loss: 1.5286 - regression_loss: 1.3146 - classification_loss: 0.2140 487/500 [============================>.] - ETA: 4s - loss: 1.5285 - regression_loss: 1.3146 - classification_loss: 0.2139 488/500 [============================>.] 
- ETA: 4s - loss: 1.5287 - regression_loss: 1.3150 - classification_loss: 0.2137 489/500 [============================>.] - ETA: 3s - loss: 1.5288 - regression_loss: 1.3152 - classification_loss: 0.2136 490/500 [============================>.] - ETA: 3s - loss: 1.5283 - regression_loss: 1.3147 - classification_loss: 0.2136 491/500 [============================>.] - ETA: 3s - loss: 1.5282 - regression_loss: 1.3147 - classification_loss: 0.2135 492/500 [============================>.] - ETA: 2s - loss: 1.5283 - regression_loss: 1.3145 - classification_loss: 0.2138 493/500 [============================>.] - ETA: 2s - loss: 1.5283 - regression_loss: 1.3146 - classification_loss: 0.2137 494/500 [============================>.] - ETA: 2s - loss: 1.5277 - regression_loss: 1.3142 - classification_loss: 0.2135 495/500 [============================>.] - ETA: 1s - loss: 1.5269 - regression_loss: 1.3135 - classification_loss: 0.2134 496/500 [============================>.] - ETA: 1s - loss: 1.5284 - regression_loss: 1.3151 - classification_loss: 0.2133 497/500 [============================>.] - ETA: 1s - loss: 1.5278 - regression_loss: 1.3147 - classification_loss: 0.2131 498/500 [============================>.] - ETA: 0s - loss: 1.5266 - regression_loss: 1.3137 - classification_loss: 0.2129 499/500 [============================>.] - ETA: 0s - loss: 1.5288 - regression_loss: 1.3158 - classification_loss: 0.2130 500/500 [==============================] - 170s 339ms/step - loss: 1.5290 - regression_loss: 1.3160 - classification_loss: 0.2130 326 instances of class plum with average precision: 0.7792 mAP: 0.7792 Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5 Epoch 10/150 1/500 [..............................] - ETA: 2:43 - loss: 1.3028 - regression_loss: 1.1793 - classification_loss: 0.1235 2/500 [..............................] - ETA: 2:48 - loss: 1.4283 - regression_loss: 1.2735 - classification_loss: 0.1548 3/500 [..............................] 
- ETA: 2:48 - loss: 1.2430 - regression_loss: 1.1127 - classification_loss: 0.1303 4/500 [..............................] - ETA: 2:46 - loss: 1.2658 - regression_loss: 1.1210 - classification_loss: 0.1447 5/500 [..............................] - ETA: 2:47 - loss: 1.4129 - regression_loss: 1.2203 - classification_loss: 0.1927 6/500 [..............................] - ETA: 2:48 - loss: 1.5288 - regression_loss: 1.3331 - classification_loss: 0.1957 7/500 [..............................] - ETA: 2:49 - loss: 1.5176 - regression_loss: 1.3333 - classification_loss: 0.1843 8/500 [..............................] - ETA: 2:48 - loss: 1.5742 - regression_loss: 1.3764 - classification_loss: 0.1978 9/500 [..............................] - ETA: 2:47 - loss: 1.4310 - regression_loss: 1.2502 - classification_loss: 0.1809 10/500 [..............................] - ETA: 2:48 - loss: 1.4705 - regression_loss: 1.2828 - classification_loss: 0.1876 11/500 [..............................] - ETA: 2:46 - loss: 1.4524 - regression_loss: 1.2729 - classification_loss: 0.1795 12/500 [..............................] - ETA: 2:46 - loss: 1.4642 - regression_loss: 1.2859 - classification_loss: 0.1783 13/500 [..............................] - ETA: 2:46 - loss: 1.4707 - regression_loss: 1.2926 - classification_loss: 0.1781 14/500 [..............................] - ETA: 2:46 - loss: 1.5054 - regression_loss: 1.3214 - classification_loss: 0.1840 15/500 [..............................] - ETA: 2:46 - loss: 1.4942 - regression_loss: 1.3114 - classification_loss: 0.1828 16/500 [..............................] - ETA: 2:45 - loss: 1.4593 - regression_loss: 1.2826 - classification_loss: 0.1767 17/500 [>.............................] - ETA: 2:45 - loss: 1.4673 - regression_loss: 1.2884 - classification_loss: 0.1789 18/500 [>.............................] - ETA: 2:44 - loss: 1.4502 - regression_loss: 1.2718 - classification_loss: 0.1784 19/500 [>.............................] 
- ETA: 2:44 - loss: 1.4650 - regression_loss: 1.2794 - classification_loss: 0.1857 20/500 [>.............................] - ETA: 2:44 - loss: 1.4479 - regression_loss: 1.2648 - classification_loss: 0.1831 21/500 [>.............................] - ETA: 2:43 - loss: 1.4484 - regression_loss: 1.2640 - classification_loss: 0.1844 22/500 [>.............................] - ETA: 2:43 - loss: 1.4798 - regression_loss: 1.2922 - classification_loss: 0.1876 23/500 [>.............................] - ETA: 2:42 - loss: 1.4912 - regression_loss: 1.3001 - classification_loss: 0.1911 24/500 [>.............................] - ETA: 2:42 - loss: 1.4859 - regression_loss: 1.2970 - classification_loss: 0.1889 25/500 [>.............................] - ETA: 2:42 - loss: 1.4915 - regression_loss: 1.3015 - classification_loss: 0.1900 26/500 [>.............................] - ETA: 2:42 - loss: 1.4800 - regression_loss: 1.2914 - classification_loss: 0.1885 27/500 [>.............................] - ETA: 2:41 - loss: 1.4939 - regression_loss: 1.3033 - classification_loss: 0.1906 28/500 [>.............................] - ETA: 2:41 - loss: 1.5093 - regression_loss: 1.3179 - classification_loss: 0.1914 29/500 [>.............................] - ETA: 2:41 - loss: 1.4946 - regression_loss: 1.3058 - classification_loss: 0.1888 30/500 [>.............................] - ETA: 2:40 - loss: 1.4887 - regression_loss: 1.3015 - classification_loss: 0.1872 31/500 [>.............................] - ETA: 2:40 - loss: 1.5023 - regression_loss: 1.3121 - classification_loss: 0.1903 32/500 [>.............................] - ETA: 2:39 - loss: 1.4982 - regression_loss: 1.3106 - classification_loss: 0.1876 33/500 [>.............................] - ETA: 2:39 - loss: 1.5107 - regression_loss: 1.3193 - classification_loss: 0.1914 34/500 [=>............................] - ETA: 2:38 - loss: 1.4838 - regression_loss: 1.2963 - classification_loss: 0.1875 35/500 [=>............................] 
- ETA: 2:38 - loss: 1.4738 - regression_loss: 1.2877 - classification_loss: 0.1860 36/500 [=>............................] - ETA: 2:38 - loss: 1.4953 - regression_loss: 1.3041 - classification_loss: 0.1912 37/500 [=>............................] - ETA: 2:38 - loss: 1.4946 - regression_loss: 1.3036 - classification_loss: 0.1910 38/500 [=>............................] - ETA: 2:37 - loss: 1.5462 - regression_loss: 1.3387 - classification_loss: 0.2076 39/500 [=>............................] - ETA: 2:37 - loss: 1.5374 - regression_loss: 1.3304 - classification_loss: 0.2069 40/500 [=>............................] - ETA: 2:37 - loss: 1.5184 - regression_loss: 1.3147 - classification_loss: 0.2037 41/500 [=>............................] - ETA: 2:37 - loss: 1.5174 - regression_loss: 1.3155 - classification_loss: 0.2019 42/500 [=>............................] - ETA: 2:36 - loss: 1.5120 - regression_loss: 1.3115 - classification_loss: 0.2005 43/500 [=>............................] - ETA: 2:36 - loss: 1.5141 - regression_loss: 1.3139 - classification_loss: 0.2002 44/500 [=>............................] - ETA: 2:36 - loss: 1.4986 - regression_loss: 1.2998 - classification_loss: 0.1988 45/500 [=>............................] - ETA: 2:35 - loss: 1.5090 - regression_loss: 1.3104 - classification_loss: 0.1985 46/500 [=>............................] - ETA: 2:35 - loss: 1.5016 - regression_loss: 1.3005 - classification_loss: 0.2011 47/500 [=>............................] - ETA: 2:35 - loss: 1.5173 - regression_loss: 1.3152 - classification_loss: 0.2021 48/500 [=>............................] - ETA: 2:34 - loss: 1.5015 - regression_loss: 1.3010 - classification_loss: 0.2004 49/500 [=>............................] - ETA: 2:34 - loss: 1.4946 - regression_loss: 1.2959 - classification_loss: 0.1987 50/500 [==>...........................] - ETA: 2:33 - loss: 1.4963 - regression_loss: 1.2972 - classification_loss: 0.1991 51/500 [==>...........................] 
[Keras per-batch progress output for steps 51/500 through 387/500, condensed. ETA counted down from 2:33 to 38s. Total loss moved from 1.4897 at step 51 to 1.4735 at step 386, fluctuating in roughly the 1.42–1.52 range along the way; regression_loss tracked from 1.2925 to 1.2665 and classification_loss from 0.1971 to 0.2070 over the same steps.]
- ETA: 38s - loss: 1.4749 - regression_loss: 1.2675 - classification_loss: 0.2074 388/500 [======================>.......] - ETA: 38s - loss: 1.4746 - regression_loss: 1.2672 - classification_loss: 0.2074 389/500 [======================>.......] - ETA: 37s - loss: 1.4748 - regression_loss: 1.2674 - classification_loss: 0.2075 390/500 [======================>.......] - ETA: 37s - loss: 1.4721 - regression_loss: 1.2650 - classification_loss: 0.2071 391/500 [======================>.......] - ETA: 37s - loss: 1.4718 - regression_loss: 1.2648 - classification_loss: 0.2070 392/500 [======================>.......] - ETA: 36s - loss: 1.4723 - regression_loss: 1.2653 - classification_loss: 0.2071 393/500 [======================>.......] - ETA: 36s - loss: 1.4724 - regression_loss: 1.2654 - classification_loss: 0.2070 394/500 [======================>.......] - ETA: 36s - loss: 1.4743 - regression_loss: 1.2670 - classification_loss: 0.2073 395/500 [======================>.......] - ETA: 35s - loss: 1.4730 - regression_loss: 1.2659 - classification_loss: 0.2071 396/500 [======================>.......] - ETA: 35s - loss: 1.4720 - regression_loss: 1.2650 - classification_loss: 0.2070 397/500 [======================>.......] - ETA: 35s - loss: 1.4762 - regression_loss: 1.2682 - classification_loss: 0.2080 398/500 [======================>.......] - ETA: 34s - loss: 1.4745 - regression_loss: 1.2669 - classification_loss: 0.2076 399/500 [======================>.......] - ETA: 34s - loss: 1.4750 - regression_loss: 1.2675 - classification_loss: 0.2075 400/500 [=======================>......] - ETA: 34s - loss: 1.4764 - regression_loss: 1.2685 - classification_loss: 0.2079 401/500 [=======================>......] - ETA: 33s - loss: 1.4778 - regression_loss: 1.2699 - classification_loss: 0.2079 402/500 [=======================>......] - ETA: 33s - loss: 1.4769 - regression_loss: 1.2691 - classification_loss: 0.2077 403/500 [=======================>......] 
- ETA: 32s - loss: 1.4763 - regression_loss: 1.2685 - classification_loss: 0.2078 404/500 [=======================>......] - ETA: 32s - loss: 1.4781 - regression_loss: 1.2698 - classification_loss: 0.2083 405/500 [=======================>......] - ETA: 32s - loss: 1.4813 - regression_loss: 1.2720 - classification_loss: 0.2093 406/500 [=======================>......] - ETA: 31s - loss: 1.4815 - regression_loss: 1.2721 - classification_loss: 0.2094 407/500 [=======================>......] - ETA: 31s - loss: 1.4789 - regression_loss: 1.2699 - classification_loss: 0.2090 408/500 [=======================>......] - ETA: 31s - loss: 1.4796 - regression_loss: 1.2705 - classification_loss: 0.2090 409/500 [=======================>......] - ETA: 30s - loss: 1.4778 - regression_loss: 1.2691 - classification_loss: 0.2087 410/500 [=======================>......] - ETA: 30s - loss: 1.4794 - regression_loss: 1.2706 - classification_loss: 0.2088 411/500 [=======================>......] - ETA: 30s - loss: 1.4807 - regression_loss: 1.2717 - classification_loss: 0.2090 412/500 [=======================>......] - ETA: 29s - loss: 1.4809 - regression_loss: 1.2719 - classification_loss: 0.2090 413/500 [=======================>......] - ETA: 29s - loss: 1.4815 - regression_loss: 1.2725 - classification_loss: 0.2090 414/500 [=======================>......] - ETA: 29s - loss: 1.4823 - regression_loss: 1.2732 - classification_loss: 0.2091 415/500 [=======================>......] - ETA: 28s - loss: 1.4850 - regression_loss: 1.2754 - classification_loss: 0.2095 416/500 [=======================>......] - ETA: 28s - loss: 1.4843 - regression_loss: 1.2749 - classification_loss: 0.2094 417/500 [========================>.....] - ETA: 28s - loss: 1.4850 - regression_loss: 1.2754 - classification_loss: 0.2095 418/500 [========================>.....] - ETA: 27s - loss: 1.4845 - regression_loss: 1.2752 - classification_loss: 0.2093 419/500 [========================>.....] 
- ETA: 27s - loss: 1.4827 - regression_loss: 1.2737 - classification_loss: 0.2090 420/500 [========================>.....] - ETA: 27s - loss: 1.4835 - regression_loss: 1.2744 - classification_loss: 0.2091 421/500 [========================>.....] - ETA: 26s - loss: 1.4828 - regression_loss: 1.2739 - classification_loss: 0.2089 422/500 [========================>.....] - ETA: 26s - loss: 1.4825 - regression_loss: 1.2736 - classification_loss: 0.2088 423/500 [========================>.....] - ETA: 26s - loss: 1.4819 - regression_loss: 1.2730 - classification_loss: 0.2089 424/500 [========================>.....] - ETA: 25s - loss: 1.4816 - regression_loss: 1.2729 - classification_loss: 0.2087 425/500 [========================>.....] - ETA: 25s - loss: 1.4816 - regression_loss: 1.2730 - classification_loss: 0.2086 426/500 [========================>.....] - ETA: 25s - loss: 1.4816 - regression_loss: 1.2726 - classification_loss: 0.2089 427/500 [========================>.....] - ETA: 24s - loss: 1.4813 - regression_loss: 1.2725 - classification_loss: 0.2088 428/500 [========================>.....] - ETA: 24s - loss: 1.4822 - regression_loss: 1.2734 - classification_loss: 0.2088 429/500 [========================>.....] - ETA: 24s - loss: 1.4836 - regression_loss: 1.2750 - classification_loss: 0.2086 430/500 [========================>.....] - ETA: 23s - loss: 1.4821 - regression_loss: 1.2737 - classification_loss: 0.2084 431/500 [========================>.....] - ETA: 23s - loss: 1.4835 - regression_loss: 1.2747 - classification_loss: 0.2088 432/500 [========================>.....] - ETA: 23s - loss: 1.4849 - regression_loss: 1.2757 - classification_loss: 0.2092 433/500 [========================>.....] - ETA: 22s - loss: 1.4834 - regression_loss: 1.2745 - classification_loss: 0.2090 434/500 [=========================>....] - ETA: 22s - loss: 1.4822 - regression_loss: 1.2735 - classification_loss: 0.2087 435/500 [=========================>....] 
- ETA: 22s - loss: 1.4815 - regression_loss: 1.2728 - classification_loss: 0.2087 436/500 [=========================>....] - ETA: 21s - loss: 1.4810 - regression_loss: 1.2724 - classification_loss: 0.2086 437/500 [=========================>....] - ETA: 21s - loss: 1.4810 - regression_loss: 1.2725 - classification_loss: 0.2085 438/500 [=========================>....] - ETA: 21s - loss: 1.4815 - regression_loss: 1.2730 - classification_loss: 0.2085 439/500 [=========================>....] - ETA: 20s - loss: 1.4813 - regression_loss: 1.2728 - classification_loss: 0.2085 440/500 [=========================>....] - ETA: 20s - loss: 1.4805 - regression_loss: 1.2722 - classification_loss: 0.2083 441/500 [=========================>....] - ETA: 20s - loss: 1.4814 - regression_loss: 1.2731 - classification_loss: 0.2083 442/500 [=========================>....] - ETA: 19s - loss: 1.4809 - regression_loss: 1.2727 - classification_loss: 0.2082 443/500 [=========================>....] - ETA: 19s - loss: 1.4822 - regression_loss: 1.2737 - classification_loss: 0.2085 444/500 [=========================>....] - ETA: 19s - loss: 1.4828 - regression_loss: 1.2744 - classification_loss: 0.2085 445/500 [=========================>....] - ETA: 18s - loss: 1.4831 - regression_loss: 1.2747 - classification_loss: 0.2084 446/500 [=========================>....] - ETA: 18s - loss: 1.4828 - regression_loss: 1.2745 - classification_loss: 0.2083 447/500 [=========================>....] - ETA: 18s - loss: 1.4837 - regression_loss: 1.2753 - classification_loss: 0.2084 448/500 [=========================>....] - ETA: 17s - loss: 1.4844 - regression_loss: 1.2759 - classification_loss: 0.2085 449/500 [=========================>....] - ETA: 17s - loss: 1.4845 - regression_loss: 1.2761 - classification_loss: 0.2084 450/500 [==========================>...] - ETA: 16s - loss: 1.4861 - regression_loss: 1.2772 - classification_loss: 0.2090 451/500 [==========================>...] 
- ETA: 16s - loss: 1.4863 - regression_loss: 1.2774 - classification_loss: 0.2089 452/500 [==========================>...] - ETA: 16s - loss: 1.4846 - regression_loss: 1.2760 - classification_loss: 0.2086 453/500 [==========================>...] - ETA: 15s - loss: 1.4830 - regression_loss: 1.2746 - classification_loss: 0.2083 454/500 [==========================>...] - ETA: 15s - loss: 1.4820 - regression_loss: 1.2738 - classification_loss: 0.2081 455/500 [==========================>...] - ETA: 15s - loss: 1.4821 - regression_loss: 1.2740 - classification_loss: 0.2081 456/500 [==========================>...] - ETA: 14s - loss: 1.4813 - regression_loss: 1.2733 - classification_loss: 0.2079 457/500 [==========================>...] - ETA: 14s - loss: 1.4800 - regression_loss: 1.2723 - classification_loss: 0.2077 458/500 [==========================>...] - ETA: 14s - loss: 1.4804 - regression_loss: 1.2725 - classification_loss: 0.2078 459/500 [==========================>...] - ETA: 13s - loss: 1.4801 - regression_loss: 1.2723 - classification_loss: 0.2078 460/500 [==========================>...] - ETA: 13s - loss: 1.4809 - regression_loss: 1.2732 - classification_loss: 0.2078 461/500 [==========================>...] - ETA: 13s - loss: 1.4810 - regression_loss: 1.2733 - classification_loss: 0.2077 462/500 [==========================>...] - ETA: 12s - loss: 1.4810 - regression_loss: 1.2735 - classification_loss: 0.2075 463/500 [==========================>...] - ETA: 12s - loss: 1.4827 - regression_loss: 1.2751 - classification_loss: 0.2076 464/500 [==========================>...] - ETA: 12s - loss: 1.4824 - regression_loss: 1.2750 - classification_loss: 0.2074 465/500 [==========================>...] - ETA: 11s - loss: 1.4821 - regression_loss: 1.2749 - classification_loss: 0.2072 466/500 [==========================>...] - ETA: 11s - loss: 1.4817 - regression_loss: 1.2746 - classification_loss: 0.2071 467/500 [===========================>..] 
- ETA: 11s - loss: 1.4816 - regression_loss: 1.2747 - classification_loss: 0.2069 468/500 [===========================>..] - ETA: 10s - loss: 1.4812 - regression_loss: 1.2745 - classification_loss: 0.2067 469/500 [===========================>..] - ETA: 10s - loss: 1.4806 - regression_loss: 1.2741 - classification_loss: 0.2065 470/500 [===========================>..] - ETA: 10s - loss: 1.4797 - regression_loss: 1.2734 - classification_loss: 0.2063 471/500 [===========================>..] - ETA: 9s - loss: 1.4796 - regression_loss: 1.2734 - classification_loss: 0.2062  472/500 [===========================>..] - ETA: 9s - loss: 1.4805 - regression_loss: 1.2741 - classification_loss: 0.2065 473/500 [===========================>..] - ETA: 9s - loss: 1.4796 - regression_loss: 1.2734 - classification_loss: 0.2063 474/500 [===========================>..] - ETA: 8s - loss: 1.4778 - regression_loss: 1.2718 - classification_loss: 0.2060 475/500 [===========================>..] - ETA: 8s - loss: 1.4770 - regression_loss: 1.2711 - classification_loss: 0.2058 476/500 [===========================>..] - ETA: 8s - loss: 1.4768 - regression_loss: 1.2710 - classification_loss: 0.2058 477/500 [===========================>..] - ETA: 7s - loss: 1.4757 - regression_loss: 1.2702 - classification_loss: 0.2055 478/500 [===========================>..] - ETA: 7s - loss: 1.4760 - regression_loss: 1.2705 - classification_loss: 0.2055 479/500 [===========================>..] - ETA: 7s - loss: 1.4762 - regression_loss: 1.2708 - classification_loss: 0.2054 480/500 [===========================>..] - ETA: 6s - loss: 1.4753 - regression_loss: 1.2696 - classification_loss: 0.2057 481/500 [===========================>..] - ETA: 6s - loss: 1.4766 - regression_loss: 1.2709 - classification_loss: 0.2057 482/500 [===========================>..] - ETA: 6s - loss: 1.4762 - regression_loss: 1.2706 - classification_loss: 0.2056 483/500 [===========================>..] 
- ETA: 5s - loss: 1.4744 - regression_loss: 1.2691 - classification_loss: 0.2053 484/500 [============================>.] - ETA: 5s - loss: 1.4747 - regression_loss: 1.2696 - classification_loss: 0.2051 485/500 [============================>.] - ETA: 5s - loss: 1.4742 - regression_loss: 1.2693 - classification_loss: 0.2049 486/500 [============================>.] - ETA: 4s - loss: 1.4736 - regression_loss: 1.2689 - classification_loss: 0.2048 487/500 [============================>.] - ETA: 4s - loss: 1.4742 - regression_loss: 1.2694 - classification_loss: 0.2048 488/500 [============================>.] - ETA: 4s - loss: 1.4738 - regression_loss: 1.2690 - classification_loss: 0.2048 489/500 [============================>.] - ETA: 3s - loss: 1.4732 - regression_loss: 1.2685 - classification_loss: 0.2047 490/500 [============================>.] - ETA: 3s - loss: 1.4730 - regression_loss: 1.2684 - classification_loss: 0.2046 491/500 [============================>.] - ETA: 3s - loss: 1.4736 - regression_loss: 1.2689 - classification_loss: 0.2047 492/500 [============================>.] - ETA: 2s - loss: 1.4735 - regression_loss: 1.2688 - classification_loss: 0.2047 493/500 [============================>.] - ETA: 2s - loss: 1.4730 - regression_loss: 1.2685 - classification_loss: 0.2045 494/500 [============================>.] - ETA: 2s - loss: 1.4723 - regression_loss: 1.2680 - classification_loss: 0.2043 495/500 [============================>.] - ETA: 1s - loss: 1.4706 - regression_loss: 1.2665 - classification_loss: 0.2041 496/500 [============================>.] - ETA: 1s - loss: 1.4705 - regression_loss: 1.2665 - classification_loss: 0.2040 497/500 [============================>.] - ETA: 1s - loss: 1.4708 - regression_loss: 1.2668 - classification_loss: 0.2040 498/500 [============================>.] - ETA: 0s - loss: 1.4697 - regression_loss: 1.2659 - classification_loss: 0.2039 499/500 [============================>.] 
- ETA: 0s - loss: 1.4688 - regression_loss: 1.2651 - classification_loss: 0.2037
500/500 [==============================] - 170s 340ms/step - loss: 1.4690 - regression_loss: 1.2653 - classification_loss: 0.2037
326 instances of class plum with average precision: 0.7591
mAP: 0.7591
Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5
Epoch 11/150
1/500 [..............................] - ETA: 2:31 - loss: 1.6787 - regression_loss: 1.2450 - classification_loss: 0.4337
[per-batch progress records for batches 2-13 omitted]
14/500 [..............................]
- ETA: 2:44 - loss: 1.3517 - regression_loss: 1.1494 - classification_loss: 0.2023
[per-batch progress records for batches 15-157 omitted; the running loss climbed from ~1.35 and settled around ~1.46-1.47]
158/500 [========>.....................]
- ETA: 1:56 - loss: 1.4630 - regression_loss: 1.2575 - classification_loss: 0.2056 159/500 [========>.....................] - ETA: 1:56 - loss: 1.4709 - regression_loss: 1.2637 - classification_loss: 0.2072 160/500 [========>.....................] - ETA: 1:55 - loss: 1.4689 - regression_loss: 1.2622 - classification_loss: 0.2067 161/500 [========>.....................] - ETA: 1:55 - loss: 1.4638 - regression_loss: 1.2580 - classification_loss: 0.2058 162/500 [========>.....................] - ETA: 1:55 - loss: 1.4635 - regression_loss: 1.2580 - classification_loss: 0.2055 163/500 [========>.....................] - ETA: 1:54 - loss: 1.4620 - regression_loss: 1.2568 - classification_loss: 0.2052 164/500 [========>.....................] - ETA: 1:54 - loss: 1.4589 - regression_loss: 1.2543 - classification_loss: 0.2046 165/500 [========>.....................] - ETA: 1:53 - loss: 1.4618 - regression_loss: 1.2572 - classification_loss: 0.2047 166/500 [========>.....................] - ETA: 1:53 - loss: 1.4575 - regression_loss: 1.2536 - classification_loss: 0.2039 167/500 [=========>....................] - ETA: 1:53 - loss: 1.4640 - regression_loss: 1.2593 - classification_loss: 0.2047 168/500 [=========>....................] - ETA: 1:52 - loss: 1.4645 - regression_loss: 1.2598 - classification_loss: 0.2047 169/500 [=========>....................] - ETA: 1:52 - loss: 1.4632 - regression_loss: 1.2588 - classification_loss: 0.2044 170/500 [=========>....................] - ETA: 1:52 - loss: 1.4640 - regression_loss: 1.2599 - classification_loss: 0.2041 171/500 [=========>....................] - ETA: 1:51 - loss: 1.4645 - regression_loss: 1.2601 - classification_loss: 0.2044 172/500 [=========>....................] - ETA: 1:51 - loss: 1.4662 - regression_loss: 1.2617 - classification_loss: 0.2045 173/500 [=========>....................] - ETA: 1:51 - loss: 1.4631 - regression_loss: 1.2588 - classification_loss: 0.2042 174/500 [=========>....................] 
- ETA: 1:50 - loss: 1.4621 - regression_loss: 1.2582 - classification_loss: 0.2039 175/500 [=========>....................] - ETA: 1:50 - loss: 1.4590 - regression_loss: 1.2557 - classification_loss: 0.2034 176/500 [=========>....................] - ETA: 1:50 - loss: 1.4576 - regression_loss: 1.2548 - classification_loss: 0.2028 177/500 [=========>....................] - ETA: 1:49 - loss: 1.4569 - regression_loss: 1.2542 - classification_loss: 0.2027 178/500 [=========>....................] - ETA: 1:49 - loss: 1.4551 - regression_loss: 1.2531 - classification_loss: 0.2020 179/500 [=========>....................] - ETA: 1:49 - loss: 1.4518 - regression_loss: 1.2504 - classification_loss: 0.2014 180/500 [=========>....................] - ETA: 1:48 - loss: 1.4509 - regression_loss: 1.2496 - classification_loss: 0.2012 181/500 [=========>....................] - ETA: 1:48 - loss: 1.4517 - regression_loss: 1.2505 - classification_loss: 0.2013 182/500 [=========>....................] - ETA: 1:48 - loss: 1.4494 - regression_loss: 1.2487 - classification_loss: 0.2006 183/500 [=========>....................] - ETA: 1:47 - loss: 1.4510 - regression_loss: 1.2498 - classification_loss: 0.2012 184/500 [==========>...................] - ETA: 1:47 - loss: 1.4503 - regression_loss: 1.2495 - classification_loss: 0.2009 185/500 [==========>...................] - ETA: 1:47 - loss: 1.4482 - regression_loss: 1.2478 - classification_loss: 0.2004 186/500 [==========>...................] - ETA: 1:46 - loss: 1.4481 - regression_loss: 1.2480 - classification_loss: 0.2001 187/500 [==========>...................] - ETA: 1:46 - loss: 1.4471 - regression_loss: 1.2470 - classification_loss: 0.2000 188/500 [==========>...................] - ETA: 1:46 - loss: 1.4433 - regression_loss: 1.2437 - classification_loss: 0.1996 189/500 [==========>...................] - ETA: 1:45 - loss: 1.4427 - regression_loss: 1.2435 - classification_loss: 0.1992 190/500 [==========>...................] 
- ETA: 1:45 - loss: 1.4416 - regression_loss: 1.2426 - classification_loss: 0.1990 191/500 [==========>...................] - ETA: 1:44 - loss: 1.4432 - regression_loss: 1.2440 - classification_loss: 0.1992 192/500 [==========>...................] - ETA: 1:44 - loss: 1.4431 - regression_loss: 1.2441 - classification_loss: 0.1990 193/500 [==========>...................] - ETA: 1:44 - loss: 1.4454 - regression_loss: 1.2461 - classification_loss: 0.1992 194/500 [==========>...................] - ETA: 1:43 - loss: 1.4445 - regression_loss: 1.2455 - classification_loss: 0.1990 195/500 [==========>...................] - ETA: 1:43 - loss: 1.4441 - regression_loss: 1.2451 - classification_loss: 0.1990 196/500 [==========>...................] - ETA: 1:43 - loss: 1.4494 - regression_loss: 1.2506 - classification_loss: 0.1988 197/500 [==========>...................] - ETA: 1:42 - loss: 1.4468 - regression_loss: 1.2485 - classification_loss: 0.1983 198/500 [==========>...................] - ETA: 1:42 - loss: 1.4461 - regression_loss: 1.2483 - classification_loss: 0.1979 199/500 [==========>...................] - ETA: 1:42 - loss: 1.4476 - regression_loss: 1.2500 - classification_loss: 0.1976 200/500 [===========>..................] - ETA: 1:41 - loss: 1.4481 - regression_loss: 1.2505 - classification_loss: 0.1975 201/500 [===========>..................] - ETA: 1:41 - loss: 1.4498 - regression_loss: 1.2520 - classification_loss: 0.1978 202/500 [===========>..................] - ETA: 1:41 - loss: 1.4502 - regression_loss: 1.2525 - classification_loss: 0.1977 203/500 [===========>..................] - ETA: 1:40 - loss: 1.4503 - regression_loss: 1.2529 - classification_loss: 0.1974 204/500 [===========>..................] - ETA: 1:40 - loss: 1.4466 - regression_loss: 1.2497 - classification_loss: 0.1969 205/500 [===========>..................] - ETA: 1:40 - loss: 1.4475 - regression_loss: 1.2506 - classification_loss: 0.1970 206/500 [===========>..................] 
- ETA: 1:39 - loss: 1.4447 - regression_loss: 1.2477 - classification_loss: 0.1970 207/500 [===========>..................] - ETA: 1:39 - loss: 1.4440 - regression_loss: 1.2473 - classification_loss: 0.1967 208/500 [===========>..................] - ETA: 1:39 - loss: 1.4407 - regression_loss: 1.2444 - classification_loss: 0.1963 209/500 [===========>..................] - ETA: 1:38 - loss: 1.4418 - regression_loss: 1.2448 - classification_loss: 0.1970 210/500 [===========>..................] - ETA: 1:38 - loss: 1.4423 - regression_loss: 1.2452 - classification_loss: 0.1971 211/500 [===========>..................] - ETA: 1:38 - loss: 1.4381 - regression_loss: 1.2418 - classification_loss: 0.1963 212/500 [===========>..................] - ETA: 1:37 - loss: 1.4412 - regression_loss: 1.2443 - classification_loss: 0.1968 213/500 [===========>..................] - ETA: 1:37 - loss: 1.4372 - regression_loss: 1.2410 - classification_loss: 0.1962 214/500 [===========>..................] - ETA: 1:37 - loss: 1.4348 - regression_loss: 1.2393 - classification_loss: 0.1956 215/500 [===========>..................] - ETA: 1:36 - loss: 1.4328 - regression_loss: 1.2375 - classification_loss: 0.1953 216/500 [===========>..................] - ETA: 1:36 - loss: 1.4314 - regression_loss: 1.2358 - classification_loss: 0.1956 217/500 [============>.................] - ETA: 1:36 - loss: 1.4327 - regression_loss: 1.2375 - classification_loss: 0.1952 218/500 [============>.................] - ETA: 1:35 - loss: 1.4298 - regression_loss: 1.2351 - classification_loss: 0.1947 219/500 [============>.................] - ETA: 1:35 - loss: 1.4277 - regression_loss: 1.2332 - classification_loss: 0.1945 220/500 [============>.................] - ETA: 1:35 - loss: 1.4291 - regression_loss: 1.2349 - classification_loss: 0.1942 221/500 [============>.................] - ETA: 1:34 - loss: 1.4243 - regression_loss: 1.2307 - classification_loss: 0.1936 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.4217 - regression_loss: 1.2286 - classification_loss: 0.1930 223/500 [============>.................] - ETA: 1:34 - loss: 1.4237 - regression_loss: 1.2303 - classification_loss: 0.1934 224/500 [============>.................] - ETA: 1:33 - loss: 1.4270 - regression_loss: 1.2333 - classification_loss: 0.1937 225/500 [============>.................] - ETA: 1:33 - loss: 1.4311 - regression_loss: 1.2363 - classification_loss: 0.1948 226/500 [============>.................] - ETA: 1:33 - loss: 1.4327 - regression_loss: 1.2376 - classification_loss: 0.1951 227/500 [============>.................] - ETA: 1:32 - loss: 1.4323 - regression_loss: 1.2376 - classification_loss: 0.1947 228/500 [============>.................] - ETA: 1:32 - loss: 1.4335 - regression_loss: 1.2386 - classification_loss: 0.1949 229/500 [============>.................] - ETA: 1:32 - loss: 1.4317 - regression_loss: 1.2371 - classification_loss: 0.1945 230/500 [============>.................] - ETA: 1:31 - loss: 1.4302 - regression_loss: 1.2357 - classification_loss: 0.1946 231/500 [============>.................] - ETA: 1:31 - loss: 1.4334 - regression_loss: 1.2384 - classification_loss: 0.1950 232/500 [============>.................] - ETA: 1:31 - loss: 1.4372 - regression_loss: 1.2412 - classification_loss: 0.1960 233/500 [============>.................] - ETA: 1:30 - loss: 1.4330 - regression_loss: 1.2376 - classification_loss: 0.1954 234/500 [=============>................] - ETA: 1:30 - loss: 1.4332 - regression_loss: 1.2378 - classification_loss: 0.1955 235/500 [=============>................] - ETA: 1:30 - loss: 1.4305 - regression_loss: 1.2355 - classification_loss: 0.1950 236/500 [=============>................] - ETA: 1:29 - loss: 1.4307 - regression_loss: 1.2358 - classification_loss: 0.1949 237/500 [=============>................] - ETA: 1:29 - loss: 1.4303 - regression_loss: 1.2306 - classification_loss: 0.1997 238/500 [=============>................] 
- ETA: 1:29 - loss: 1.4278 - regression_loss: 1.2285 - classification_loss: 0.1992 239/500 [=============>................] - ETA: 1:28 - loss: 1.4266 - regression_loss: 1.2276 - classification_loss: 0.1989 240/500 [=============>................] - ETA: 1:28 - loss: 1.4253 - regression_loss: 1.2266 - classification_loss: 0.1986 241/500 [=============>................] - ETA: 1:28 - loss: 1.4257 - regression_loss: 1.2271 - classification_loss: 0.1986 242/500 [=============>................] - ETA: 1:27 - loss: 1.4286 - regression_loss: 1.2298 - classification_loss: 0.1989 243/500 [=============>................] - ETA: 1:27 - loss: 1.4263 - regression_loss: 1.2278 - classification_loss: 0.1985 244/500 [=============>................] - ETA: 1:26 - loss: 1.4257 - regression_loss: 1.2274 - classification_loss: 0.1983 245/500 [=============>................] - ETA: 1:26 - loss: 1.4246 - regression_loss: 1.2265 - classification_loss: 0.1981 246/500 [=============>................] - ETA: 1:26 - loss: 1.4249 - regression_loss: 1.2267 - classification_loss: 0.1982 247/500 [=============>................] - ETA: 1:25 - loss: 1.4246 - regression_loss: 1.2265 - classification_loss: 0.1981 248/500 [=============>................] - ETA: 1:25 - loss: 1.4247 - regression_loss: 1.2266 - classification_loss: 0.1981 249/500 [=============>................] - ETA: 1:25 - loss: 1.4266 - regression_loss: 1.2281 - classification_loss: 0.1985 250/500 [==============>...............] - ETA: 1:24 - loss: 1.4253 - regression_loss: 1.2269 - classification_loss: 0.1984 251/500 [==============>...............] - ETA: 1:24 - loss: 1.4232 - regression_loss: 1.2251 - classification_loss: 0.1981 252/500 [==============>...............] - ETA: 1:24 - loss: 1.4202 - regression_loss: 1.2226 - classification_loss: 0.1977 253/500 [==============>...............] - ETA: 1:23 - loss: 1.4227 - regression_loss: 1.2249 - classification_loss: 0.1977 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.4243 - regression_loss: 1.2261 - classification_loss: 0.1981 255/500 [==============>...............] - ETA: 1:23 - loss: 1.4269 - regression_loss: 1.2278 - classification_loss: 0.1991 256/500 [==============>...............] - ETA: 1:22 - loss: 1.4237 - regression_loss: 1.2252 - classification_loss: 0.1985 257/500 [==============>...............] - ETA: 1:22 - loss: 1.4240 - regression_loss: 1.2256 - classification_loss: 0.1984 258/500 [==============>...............] - ETA: 1:22 - loss: 1.4221 - regression_loss: 1.2240 - classification_loss: 0.1980 259/500 [==============>...............] - ETA: 1:21 - loss: 1.4217 - regression_loss: 1.2236 - classification_loss: 0.1981 260/500 [==============>...............] - ETA: 1:21 - loss: 1.4201 - regression_loss: 1.2223 - classification_loss: 0.1978 261/500 [==============>...............] - ETA: 1:21 - loss: 1.4198 - regression_loss: 1.2220 - classification_loss: 0.1978 262/500 [==============>...............] - ETA: 1:20 - loss: 1.4214 - regression_loss: 1.2232 - classification_loss: 0.1982 263/500 [==============>...............] - ETA: 1:20 - loss: 1.4211 - regression_loss: 1.2232 - classification_loss: 0.1980 264/500 [==============>...............] - ETA: 1:20 - loss: 1.4185 - regression_loss: 1.2210 - classification_loss: 0.1975 265/500 [==============>...............] - ETA: 1:19 - loss: 1.4173 - regression_loss: 1.2201 - classification_loss: 0.1972 266/500 [==============>...............] - ETA: 1:19 - loss: 1.4156 - regression_loss: 1.2188 - classification_loss: 0.1967 267/500 [===============>..............] - ETA: 1:19 - loss: 1.4176 - regression_loss: 1.2207 - classification_loss: 0.1970 268/500 [===============>..............] - ETA: 1:18 - loss: 1.4175 - regression_loss: 1.2191 - classification_loss: 0.1983 269/500 [===============>..............] - ETA: 1:18 - loss: 1.4182 - regression_loss: 1.2199 - classification_loss: 0.1983 270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.4189 - regression_loss: 1.2206 - classification_loss: 0.1983 271/500 [===============>..............] - ETA: 1:17 - loss: 1.4215 - regression_loss: 1.2227 - classification_loss: 0.1988 272/500 [===============>..............] - ETA: 1:17 - loss: 1.4216 - regression_loss: 1.2228 - classification_loss: 0.1987 273/500 [===============>..............] - ETA: 1:17 - loss: 1.4203 - regression_loss: 1.2218 - classification_loss: 0.1985 274/500 [===============>..............] - ETA: 1:16 - loss: 1.4200 - regression_loss: 1.2216 - classification_loss: 0.1984 275/500 [===============>..............] - ETA: 1:16 - loss: 1.4197 - regression_loss: 1.2216 - classification_loss: 0.1981 276/500 [===============>..............] - ETA: 1:16 - loss: 1.4188 - regression_loss: 1.2210 - classification_loss: 0.1978 277/500 [===============>..............] - ETA: 1:15 - loss: 1.4183 - regression_loss: 1.2205 - classification_loss: 0.1979 278/500 [===============>..............] - ETA: 1:15 - loss: 1.4173 - regression_loss: 1.2196 - classification_loss: 0.1976 279/500 [===============>..............] - ETA: 1:15 - loss: 1.4167 - regression_loss: 1.2193 - classification_loss: 0.1974 280/500 [===============>..............] - ETA: 1:14 - loss: 1.4185 - regression_loss: 1.2208 - classification_loss: 0.1977 281/500 [===============>..............] - ETA: 1:14 - loss: 1.4182 - regression_loss: 1.2208 - classification_loss: 0.1974 282/500 [===============>..............] - ETA: 1:14 - loss: 1.4182 - regression_loss: 1.2210 - classification_loss: 0.1972 283/500 [===============>..............] - ETA: 1:13 - loss: 1.4178 - regression_loss: 1.2208 - classification_loss: 0.1970 284/500 [================>.............] - ETA: 1:13 - loss: 1.4135 - regression_loss: 1.2165 - classification_loss: 0.1971 285/500 [================>.............] - ETA: 1:13 - loss: 1.4164 - regression_loss: 1.2187 - classification_loss: 0.1977 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.4156 - regression_loss: 1.2180 - classification_loss: 0.1976 287/500 [================>.............] - ETA: 1:12 - loss: 1.4185 - regression_loss: 1.2202 - classification_loss: 0.1983 288/500 [================>.............] - ETA: 1:11 - loss: 1.4195 - regression_loss: 1.2210 - classification_loss: 0.1985 289/500 [================>.............] - ETA: 1:11 - loss: 1.4172 - regression_loss: 1.2190 - classification_loss: 0.1981 290/500 [================>.............] - ETA: 1:11 - loss: 1.4192 - regression_loss: 1.2211 - classification_loss: 0.1981 291/500 [================>.............] - ETA: 1:10 - loss: 1.4221 - regression_loss: 1.2232 - classification_loss: 0.1989 292/500 [================>.............] - ETA: 1:10 - loss: 1.4229 - regression_loss: 1.2240 - classification_loss: 0.1989 293/500 [================>.............] - ETA: 1:10 - loss: 1.4212 - regression_loss: 1.2226 - classification_loss: 0.1985 294/500 [================>.............] - ETA: 1:09 - loss: 1.4241 - regression_loss: 1.2247 - classification_loss: 0.1993 295/500 [================>.............] - ETA: 1:09 - loss: 1.4228 - regression_loss: 1.2237 - classification_loss: 0.1991 296/500 [================>.............] - ETA: 1:09 - loss: 1.4231 - regression_loss: 1.2241 - classification_loss: 0.1990 297/500 [================>.............] - ETA: 1:08 - loss: 1.4228 - regression_loss: 1.2240 - classification_loss: 0.1988 298/500 [================>.............] - ETA: 1:08 - loss: 1.4247 - regression_loss: 1.2255 - classification_loss: 0.1992 299/500 [================>.............] - ETA: 1:08 - loss: 1.4260 - regression_loss: 1.2265 - classification_loss: 0.1995 300/500 [=================>............] - ETA: 1:07 - loss: 1.4228 - regression_loss: 1.2238 - classification_loss: 0.1990 301/500 [=================>............] - ETA: 1:07 - loss: 1.4223 - regression_loss: 1.2236 - classification_loss: 0.1987 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.4200 - regression_loss: 1.2216 - classification_loss: 0.1984 303/500 [=================>............] - ETA: 1:06 - loss: 1.4209 - regression_loss: 1.2227 - classification_loss: 0.1982 304/500 [=================>............] - ETA: 1:06 - loss: 1.4204 - regression_loss: 1.2222 - classification_loss: 0.1982 305/500 [=================>............] - ETA: 1:06 - loss: 1.4209 - regression_loss: 1.2228 - classification_loss: 0.1981 306/500 [=================>............] - ETA: 1:05 - loss: 1.4208 - regression_loss: 1.2228 - classification_loss: 0.1980 307/500 [=================>............] - ETA: 1:05 - loss: 1.4203 - regression_loss: 1.2224 - classification_loss: 0.1979 308/500 [=================>............] - ETA: 1:05 - loss: 1.4200 - regression_loss: 1.2223 - classification_loss: 0.1977 309/500 [=================>............] - ETA: 1:04 - loss: 1.4199 - regression_loss: 1.2221 - classification_loss: 0.1977 310/500 [=================>............] - ETA: 1:04 - loss: 1.4179 - regression_loss: 1.2206 - classification_loss: 0.1973 311/500 [=================>............] - ETA: 1:04 - loss: 1.4211 - regression_loss: 1.2233 - classification_loss: 0.1978 312/500 [=================>............] - ETA: 1:03 - loss: 1.4217 - regression_loss: 1.2239 - classification_loss: 0.1978 313/500 [=================>............] - ETA: 1:03 - loss: 1.4190 - regression_loss: 1.2216 - classification_loss: 0.1974 314/500 [=================>............] - ETA: 1:03 - loss: 1.4194 - regression_loss: 1.2220 - classification_loss: 0.1973 315/500 [=================>............] - ETA: 1:02 - loss: 1.4192 - regression_loss: 1.2220 - classification_loss: 0.1972 316/500 [=================>............] - ETA: 1:02 - loss: 1.4183 - regression_loss: 1.2213 - classification_loss: 0.1970 317/500 [==================>...........] - ETA: 1:02 - loss: 1.4172 - regression_loss: 1.2205 - classification_loss: 0.1967 318/500 [==================>...........] 
- ETA: 1:01 - loss: 1.4163 - regression_loss: 1.2196 - classification_loss: 0.1967 319/500 [==================>...........] - ETA: 1:01 - loss: 1.4162 - regression_loss: 1.2196 - classification_loss: 0.1966 320/500 [==================>...........] - ETA: 1:01 - loss: 1.4154 - regression_loss: 1.2191 - classification_loss: 0.1963 321/500 [==================>...........] - ETA: 1:00 - loss: 1.4145 - regression_loss: 1.2184 - classification_loss: 0.1961 322/500 [==================>...........] - ETA: 1:00 - loss: 1.4147 - regression_loss: 1.2188 - classification_loss: 0.1959 323/500 [==================>...........] - ETA: 1:00 - loss: 1.4144 - regression_loss: 1.2186 - classification_loss: 0.1958 324/500 [==================>...........] - ETA: 59s - loss: 1.4154 - regression_loss: 1.2195 - classification_loss: 0.1959  325/500 [==================>...........] - ETA: 59s - loss: 1.4215 - regression_loss: 1.2246 - classification_loss: 0.1969 326/500 [==================>...........] - ETA: 59s - loss: 1.4216 - regression_loss: 1.2245 - classification_loss: 0.1971 327/500 [==================>...........] - ETA: 58s - loss: 1.4229 - regression_loss: 1.2257 - classification_loss: 0.1972 328/500 [==================>...........] - ETA: 58s - loss: 1.4231 - regression_loss: 1.2259 - classification_loss: 0.1972 329/500 [==================>...........] - ETA: 58s - loss: 1.4239 - regression_loss: 1.2266 - classification_loss: 0.1973 330/500 [==================>...........] - ETA: 57s - loss: 1.4231 - regression_loss: 1.2260 - classification_loss: 0.1971 331/500 [==================>...........] - ETA: 57s - loss: 1.4219 - regression_loss: 1.2251 - classification_loss: 0.1968 332/500 [==================>...........] - ETA: 57s - loss: 1.4214 - regression_loss: 1.2248 - classification_loss: 0.1966 333/500 [==================>...........] - ETA: 56s - loss: 1.4199 - regression_loss: 1.2235 - classification_loss: 0.1964 334/500 [===================>..........] 
- ETA: 56s - loss: 1.4200 - regression_loss: 1.2235 - classification_loss: 0.1965 335/500 [===================>..........] - ETA: 55s - loss: 1.4210 - regression_loss: 1.2242 - classification_loss: 0.1968 336/500 [===================>..........] - ETA: 55s - loss: 1.4240 - regression_loss: 1.2264 - classification_loss: 0.1976 337/500 [===================>..........] - ETA: 55s - loss: 1.4219 - regression_loss: 1.2246 - classification_loss: 0.1973 338/500 [===================>..........] - ETA: 54s - loss: 1.4197 - regression_loss: 1.2210 - classification_loss: 0.1987 339/500 [===================>..........] - ETA: 54s - loss: 1.4208 - regression_loss: 1.2220 - classification_loss: 0.1988 340/500 [===================>..........] - ETA: 54s - loss: 1.4221 - regression_loss: 1.2231 - classification_loss: 0.1991 341/500 [===================>..........] - ETA: 53s - loss: 1.4217 - regression_loss: 1.2227 - classification_loss: 0.1990 342/500 [===================>..........] - ETA: 53s - loss: 1.4210 - regression_loss: 1.2222 - classification_loss: 0.1988 343/500 [===================>..........] - ETA: 53s - loss: 1.4219 - regression_loss: 1.2226 - classification_loss: 0.1993 344/500 [===================>..........] - ETA: 52s - loss: 1.4230 - regression_loss: 1.2240 - classification_loss: 0.1990 345/500 [===================>..........] - ETA: 52s - loss: 1.4221 - regression_loss: 1.2232 - classification_loss: 0.1989 346/500 [===================>..........] - ETA: 52s - loss: 1.4193 - regression_loss: 1.2207 - classification_loss: 0.1985 347/500 [===================>..........] - ETA: 51s - loss: 1.4159 - regression_loss: 1.2178 - classification_loss: 0.1981 348/500 [===================>..........] - ETA: 51s - loss: 1.4146 - regression_loss: 1.2167 - classification_loss: 0.1979 349/500 [===================>..........] - ETA: 51s - loss: 1.4148 - regression_loss: 1.2168 - classification_loss: 0.1980 350/500 [====================>.........] 
- ETA: 50s - loss: 1.4183 - regression_loss: 1.2198 - classification_loss: 0.1984 351/500 [====================>.........] - ETA: 50s - loss: 1.4195 - regression_loss: 1.2208 - classification_loss: 0.1988 352/500 [====================>.........] - ETA: 50s - loss: 1.4196 - regression_loss: 1.2209 - classification_loss: 0.1987 353/500 [====================>.........] - ETA: 49s - loss: 1.4222 - regression_loss: 1.2234 - classification_loss: 0.1988 354/500 [====================>.........] - ETA: 49s - loss: 1.4238 - regression_loss: 1.2244 - classification_loss: 0.1994 355/500 [====================>.........] - ETA: 49s - loss: 1.4219 - regression_loss: 1.2230 - classification_loss: 0.1990 356/500 [====================>.........] - ETA: 48s - loss: 1.4212 - regression_loss: 1.2221 - classification_loss: 0.1990 357/500 [====================>.........] - ETA: 48s - loss: 1.4209 - regression_loss: 1.2221 - classification_loss: 0.1988 358/500 [====================>.........] - ETA: 48s - loss: 1.4218 - regression_loss: 1.2229 - classification_loss: 0.1989 359/500 [====================>.........] - ETA: 47s - loss: 1.4205 - regression_loss: 1.2216 - classification_loss: 0.1989 360/500 [====================>.........] - ETA: 47s - loss: 1.4209 - regression_loss: 1.2223 - classification_loss: 0.1987 361/500 [====================>.........] - ETA: 47s - loss: 1.4204 - regression_loss: 1.2219 - classification_loss: 0.1985 362/500 [====================>.........] - ETA: 46s - loss: 1.4190 - regression_loss: 1.2208 - classification_loss: 0.1982 363/500 [====================>.........] - ETA: 46s - loss: 1.4187 - regression_loss: 1.2206 - classification_loss: 0.1981 364/500 [====================>.........] - ETA: 46s - loss: 1.4187 - regression_loss: 1.2207 - classification_loss: 0.1981 365/500 [====================>.........] - ETA: 45s - loss: 1.4187 - regression_loss: 1.2207 - classification_loss: 0.1980 366/500 [====================>.........] 
- ETA: 45s - loss: 1.4196 - regression_loss: 1.2212 - classification_loss: 0.1984
  ... (per-batch progress output for steps 367-499 truncated) ...
500/500 [==============================] - 169s 339ms/step - loss: 1.4387 - regression_loss: 1.2393 - classification_loss: 0.1994
326 instances of class plum with average precision: 0.7817
mAP: 0.7817
Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5
Epoch 12/150
  ... (per-batch progress output for steps 1-199 truncated) ...
200/500 [===========>..................] - ETA: 1:42 - loss: 1.3962 - regression_loss: 1.1998 - classification_loss: 0.1964 201/500 [===========>..................]
- ETA: 1:41 - loss: 1.3981 - regression_loss: 1.2017 - classification_loss: 0.1964 202/500 [===========>..................] - ETA: 1:41 - loss: 1.3990 - regression_loss: 1.2026 - classification_loss: 0.1963 203/500 [===========>..................] - ETA: 1:41 - loss: 1.4006 - regression_loss: 1.2040 - classification_loss: 0.1965 204/500 [===========>..................] - ETA: 1:40 - loss: 1.4028 - regression_loss: 1.2059 - classification_loss: 0.1970 205/500 [===========>..................] - ETA: 1:40 - loss: 1.4005 - regression_loss: 1.2040 - classification_loss: 0.1965 206/500 [===========>..................] - ETA: 1:40 - loss: 1.3971 - regression_loss: 1.2013 - classification_loss: 0.1958 207/500 [===========>..................] - ETA: 1:39 - loss: 1.3984 - regression_loss: 1.2027 - classification_loss: 0.1957 208/500 [===========>..................] - ETA: 1:39 - loss: 1.4008 - regression_loss: 1.2048 - classification_loss: 0.1960 209/500 [===========>..................] - ETA: 1:39 - loss: 1.3991 - regression_loss: 1.2036 - classification_loss: 0.1955 210/500 [===========>..................] - ETA: 1:38 - loss: 1.3991 - regression_loss: 1.2038 - classification_loss: 0.1953 211/500 [===========>..................] - ETA: 1:38 - loss: 1.4048 - regression_loss: 1.2092 - classification_loss: 0.1956 212/500 [===========>..................] - ETA: 1:38 - loss: 1.4051 - regression_loss: 1.2096 - classification_loss: 0.1955 213/500 [===========>..................] - ETA: 1:37 - loss: 1.4045 - regression_loss: 1.2092 - classification_loss: 0.1953 214/500 [===========>..................] - ETA: 1:37 - loss: 1.4012 - regression_loss: 1.2066 - classification_loss: 0.1947 215/500 [===========>..................] - ETA: 1:37 - loss: 1.4019 - regression_loss: 1.2072 - classification_loss: 0.1947 216/500 [===========>..................] - ETA: 1:36 - loss: 1.3975 - regression_loss: 1.2034 - classification_loss: 0.1941 217/500 [============>.................] 
- ETA: 1:36 - loss: 1.3973 - regression_loss: 1.2033 - classification_loss: 0.1940 218/500 [============>.................] - ETA: 1:36 - loss: 1.3977 - regression_loss: 1.2038 - classification_loss: 0.1938 219/500 [============>.................] - ETA: 1:35 - loss: 1.3957 - regression_loss: 1.2023 - classification_loss: 0.1934 220/500 [============>.................] - ETA: 1:35 - loss: 1.3978 - regression_loss: 1.2042 - classification_loss: 0.1937 221/500 [============>.................] - ETA: 1:35 - loss: 1.3964 - regression_loss: 1.2031 - classification_loss: 0.1933 222/500 [============>.................] - ETA: 1:34 - loss: 1.3974 - regression_loss: 1.2040 - classification_loss: 0.1934 223/500 [============>.................] - ETA: 1:34 - loss: 1.3963 - regression_loss: 1.2031 - classification_loss: 0.1932 224/500 [============>.................] - ETA: 1:34 - loss: 1.3955 - regression_loss: 1.2027 - classification_loss: 0.1928 225/500 [============>.................] - ETA: 1:33 - loss: 1.3949 - regression_loss: 1.2021 - classification_loss: 0.1928 226/500 [============>.................] - ETA: 1:33 - loss: 1.3936 - regression_loss: 1.2010 - classification_loss: 0.1926 227/500 [============>.................] - ETA: 1:33 - loss: 1.3985 - regression_loss: 1.2048 - classification_loss: 0.1937 228/500 [============>.................] - ETA: 1:32 - loss: 1.3989 - regression_loss: 1.2052 - classification_loss: 0.1937 229/500 [============>.................] - ETA: 1:32 - loss: 1.3966 - regression_loss: 1.2032 - classification_loss: 0.1934 230/500 [============>.................] - ETA: 1:32 - loss: 1.3960 - regression_loss: 1.2029 - classification_loss: 0.1931 231/500 [============>.................] - ETA: 1:31 - loss: 1.3932 - regression_loss: 1.2006 - classification_loss: 0.1926 232/500 [============>.................] - ETA: 1:31 - loss: 1.3944 - regression_loss: 1.2021 - classification_loss: 0.1923 233/500 [============>.................] 
- ETA: 1:31 - loss: 1.3935 - regression_loss: 1.2014 - classification_loss: 0.1921 234/500 [=============>................] - ETA: 1:30 - loss: 1.3940 - regression_loss: 1.2021 - classification_loss: 0.1919 235/500 [=============>................] - ETA: 1:30 - loss: 1.3930 - regression_loss: 1.2014 - classification_loss: 0.1916 236/500 [=============>................] - ETA: 1:30 - loss: 1.3923 - regression_loss: 1.2011 - classification_loss: 0.1912 237/500 [=============>................] - ETA: 1:29 - loss: 1.3937 - regression_loss: 1.2020 - classification_loss: 0.1917 238/500 [=============>................] - ETA: 1:29 - loss: 1.3968 - regression_loss: 1.2044 - classification_loss: 0.1925 239/500 [=============>................] - ETA: 1:29 - loss: 1.3945 - regression_loss: 1.2026 - classification_loss: 0.1918 240/500 [=============>................] - ETA: 1:28 - loss: 1.3945 - regression_loss: 1.2028 - classification_loss: 0.1917 241/500 [=============>................] - ETA: 1:28 - loss: 1.3954 - regression_loss: 1.2034 - classification_loss: 0.1921 242/500 [=============>................] - ETA: 1:27 - loss: 1.3981 - regression_loss: 1.2056 - classification_loss: 0.1925 243/500 [=============>................] - ETA: 1:27 - loss: 1.3954 - regression_loss: 1.2034 - classification_loss: 0.1920 244/500 [=============>................] - ETA: 1:27 - loss: 1.3961 - regression_loss: 1.2041 - classification_loss: 0.1920 245/500 [=============>................] - ETA: 1:26 - loss: 1.3925 - regression_loss: 1.2010 - classification_loss: 0.1915 246/500 [=============>................] - ETA: 1:26 - loss: 1.3938 - regression_loss: 1.2022 - classification_loss: 0.1917 247/500 [=============>................] - ETA: 1:26 - loss: 1.3988 - regression_loss: 1.2062 - classification_loss: 0.1926 248/500 [=============>................] - ETA: 1:25 - loss: 1.4003 - regression_loss: 1.2077 - classification_loss: 0.1926 249/500 [=============>................] 
- ETA: 1:25 - loss: 1.4015 - regression_loss: 1.2088 - classification_loss: 0.1927 250/500 [==============>...............] - ETA: 1:25 - loss: 1.4016 - regression_loss: 1.2089 - classification_loss: 0.1927 251/500 [==============>...............] - ETA: 1:24 - loss: 1.4008 - regression_loss: 1.2083 - classification_loss: 0.1925 252/500 [==============>...............] - ETA: 1:24 - loss: 1.4004 - regression_loss: 1.2080 - classification_loss: 0.1924 253/500 [==============>...............] - ETA: 1:24 - loss: 1.3994 - regression_loss: 1.2072 - classification_loss: 0.1922 254/500 [==============>...............] - ETA: 1:23 - loss: 1.3977 - regression_loss: 1.2059 - classification_loss: 0.1918 255/500 [==============>...............] - ETA: 1:23 - loss: 1.3952 - regression_loss: 1.2037 - classification_loss: 0.1915 256/500 [==============>...............] - ETA: 1:23 - loss: 1.3973 - regression_loss: 1.2052 - classification_loss: 0.1921 257/500 [==============>...............] - ETA: 1:22 - loss: 1.3993 - regression_loss: 1.2072 - classification_loss: 0.1921 258/500 [==============>...............] - ETA: 1:22 - loss: 1.4010 - regression_loss: 1.2090 - classification_loss: 0.1920 259/500 [==============>...............] - ETA: 1:22 - loss: 1.4000 - regression_loss: 1.2083 - classification_loss: 0.1917 260/500 [==============>...............] - ETA: 1:21 - loss: 1.3977 - regression_loss: 1.2064 - classification_loss: 0.1913 261/500 [==============>...............] - ETA: 1:21 - loss: 1.3976 - regression_loss: 1.2064 - classification_loss: 0.1911 262/500 [==============>...............] - ETA: 1:21 - loss: 1.3972 - regression_loss: 1.2061 - classification_loss: 0.1910 263/500 [==============>...............] - ETA: 1:20 - loss: 1.3961 - regression_loss: 1.2041 - classification_loss: 0.1920 264/500 [==============>...............] - ETA: 1:20 - loss: 1.3928 - regression_loss: 1.2014 - classification_loss: 0.1915 265/500 [==============>...............] 
- ETA: 1:20 - loss: 1.3919 - regression_loss: 1.2006 - classification_loss: 0.1913 266/500 [==============>...............] - ETA: 1:19 - loss: 1.3915 - regression_loss: 1.2004 - classification_loss: 0.1911 267/500 [===============>..............] - ETA: 1:19 - loss: 1.3946 - regression_loss: 1.2027 - classification_loss: 0.1919 268/500 [===============>..............] - ETA: 1:19 - loss: 1.3934 - regression_loss: 1.2019 - classification_loss: 0.1915 269/500 [===============>..............] - ETA: 1:18 - loss: 1.3933 - regression_loss: 1.2016 - classification_loss: 0.1917 270/500 [===============>..............] - ETA: 1:18 - loss: 1.3929 - regression_loss: 1.2014 - classification_loss: 0.1916 271/500 [===============>..............] - ETA: 1:18 - loss: 1.3937 - regression_loss: 1.2022 - classification_loss: 0.1915 272/500 [===============>..............] - ETA: 1:17 - loss: 1.3923 - regression_loss: 1.2010 - classification_loss: 0.1912 273/500 [===============>..............] - ETA: 1:17 - loss: 1.3930 - regression_loss: 1.2017 - classification_loss: 0.1913 274/500 [===============>..............] - ETA: 1:17 - loss: 1.3896 - regression_loss: 1.1989 - classification_loss: 0.1908 275/500 [===============>..............] - ETA: 1:16 - loss: 1.3906 - regression_loss: 1.1995 - classification_loss: 0.1911 276/500 [===============>..............] - ETA: 1:16 - loss: 1.3904 - regression_loss: 1.1998 - classification_loss: 0.1906 277/500 [===============>..............] - ETA: 1:16 - loss: 1.3901 - regression_loss: 1.1997 - classification_loss: 0.1904 278/500 [===============>..............] - ETA: 1:15 - loss: 1.3891 - regression_loss: 1.1989 - classification_loss: 0.1902 279/500 [===============>..............] - ETA: 1:15 - loss: 1.3918 - regression_loss: 1.2008 - classification_loss: 0.1910 280/500 [===============>..............] - ETA: 1:15 - loss: 1.3920 - regression_loss: 1.2012 - classification_loss: 0.1909 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.3917 - regression_loss: 1.2010 - classification_loss: 0.1907 282/500 [===============>..............] - ETA: 1:14 - loss: 1.3908 - regression_loss: 1.2003 - classification_loss: 0.1905 283/500 [===============>..............] - ETA: 1:14 - loss: 1.3893 - regression_loss: 1.1990 - classification_loss: 0.1903 284/500 [================>.............] - ETA: 1:13 - loss: 1.3885 - regression_loss: 1.1985 - classification_loss: 0.1901 285/500 [================>.............] - ETA: 1:13 - loss: 1.3863 - regression_loss: 1.1966 - classification_loss: 0.1897 286/500 [================>.............] - ETA: 1:12 - loss: 1.3884 - regression_loss: 1.1980 - classification_loss: 0.1904 287/500 [================>.............] - ETA: 1:12 - loss: 1.3878 - regression_loss: 1.1976 - classification_loss: 0.1902 288/500 [================>.............] - ETA: 1:12 - loss: 1.3867 - regression_loss: 1.1966 - classification_loss: 0.1901 289/500 [================>.............] - ETA: 1:11 - loss: 1.3859 - regression_loss: 1.1960 - classification_loss: 0.1899 290/500 [================>.............] - ETA: 1:11 - loss: 1.3890 - regression_loss: 1.1980 - classification_loss: 0.1910 291/500 [================>.............] - ETA: 1:11 - loss: 1.3895 - regression_loss: 1.1983 - classification_loss: 0.1911 292/500 [================>.............] - ETA: 1:10 - loss: 1.3889 - regression_loss: 1.1976 - classification_loss: 0.1912 293/500 [================>.............] - ETA: 1:10 - loss: 1.3871 - regression_loss: 1.1962 - classification_loss: 0.1909 294/500 [================>.............] - ETA: 1:10 - loss: 1.3859 - regression_loss: 1.1954 - classification_loss: 0.1906 295/500 [================>.............] - ETA: 1:09 - loss: 1.3873 - regression_loss: 1.1967 - classification_loss: 0.1906 296/500 [================>.............] - ETA: 1:09 - loss: 1.3859 - regression_loss: 1.1955 - classification_loss: 0.1904 297/500 [================>.............] 
- ETA: 1:09 - loss: 1.3841 - regression_loss: 1.1936 - classification_loss: 0.1906 298/500 [================>.............] - ETA: 1:08 - loss: 1.3827 - regression_loss: 1.1925 - classification_loss: 0.1902 299/500 [================>.............] - ETA: 1:08 - loss: 1.3835 - regression_loss: 1.1931 - classification_loss: 0.1903 300/500 [=================>............] - ETA: 1:08 - loss: 1.3847 - regression_loss: 1.1942 - classification_loss: 0.1904 301/500 [=================>............] - ETA: 1:07 - loss: 1.3855 - regression_loss: 1.1947 - classification_loss: 0.1909 302/500 [=================>............] - ETA: 1:07 - loss: 1.3877 - regression_loss: 1.1966 - classification_loss: 0.1911 303/500 [=================>............] - ETA: 1:07 - loss: 1.3874 - regression_loss: 1.1963 - classification_loss: 0.1911 304/500 [=================>............] - ETA: 1:06 - loss: 1.3866 - regression_loss: 1.1957 - classification_loss: 0.1909 305/500 [=================>............] - ETA: 1:06 - loss: 1.3859 - regression_loss: 1.1950 - classification_loss: 0.1909 306/500 [=================>............] - ETA: 1:06 - loss: 1.3853 - regression_loss: 1.1945 - classification_loss: 0.1907 307/500 [=================>............] - ETA: 1:05 - loss: 1.3833 - regression_loss: 1.1929 - classification_loss: 0.1904 308/500 [=================>............] - ETA: 1:05 - loss: 1.3812 - regression_loss: 1.1912 - classification_loss: 0.1900 309/500 [=================>............] - ETA: 1:05 - loss: 1.3788 - regression_loss: 1.1890 - classification_loss: 0.1897 310/500 [=================>............] - ETA: 1:04 - loss: 1.3780 - regression_loss: 1.1884 - classification_loss: 0.1896 311/500 [=================>............] - ETA: 1:04 - loss: 1.3777 - regression_loss: 1.1882 - classification_loss: 0.1896 312/500 [=================>............] - ETA: 1:04 - loss: 1.3794 - regression_loss: 1.1894 - classification_loss: 0.1900 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.3803 - regression_loss: 1.1905 - classification_loss: 0.1898 314/500 [=================>............] - ETA: 1:03 - loss: 1.3815 - regression_loss: 1.1913 - classification_loss: 0.1902 315/500 [=================>............] - ETA: 1:03 - loss: 1.3836 - regression_loss: 1.1930 - classification_loss: 0.1906 316/500 [=================>............] - ETA: 1:02 - loss: 1.3840 - regression_loss: 1.1936 - classification_loss: 0.1905 317/500 [==================>...........] - ETA: 1:02 - loss: 1.3837 - regression_loss: 1.1934 - classification_loss: 0.1903 318/500 [==================>...........] - ETA: 1:02 - loss: 1.3824 - regression_loss: 1.1918 - classification_loss: 0.1905 319/500 [==================>...........] - ETA: 1:01 - loss: 1.3828 - regression_loss: 1.1923 - classification_loss: 0.1905 320/500 [==================>...........] - ETA: 1:01 - loss: 1.3821 - regression_loss: 1.1918 - classification_loss: 0.1903 321/500 [==================>...........] - ETA: 1:01 - loss: 1.3818 - regression_loss: 1.1917 - classification_loss: 0.1902 322/500 [==================>...........] - ETA: 1:00 - loss: 1.3801 - regression_loss: 1.1903 - classification_loss: 0.1898 323/500 [==================>...........] - ETA: 1:00 - loss: 1.3789 - regression_loss: 1.1894 - classification_loss: 0.1895 324/500 [==================>...........] - ETA: 1:00 - loss: 1.3786 - regression_loss: 1.1891 - classification_loss: 0.1895 325/500 [==================>...........] - ETA: 59s - loss: 1.3786 - regression_loss: 1.1892 - classification_loss: 0.1894  326/500 [==================>...........] - ETA: 59s - loss: 1.3772 - regression_loss: 1.1880 - classification_loss: 0.1892 327/500 [==================>...........] - ETA: 58s - loss: 1.3775 - regression_loss: 1.1885 - classification_loss: 0.1891 328/500 [==================>...........] - ETA: 58s - loss: 1.3756 - regression_loss: 1.1868 - classification_loss: 0.1888 329/500 [==================>...........] 
- ETA: 58s - loss: 1.3763 - regression_loss: 1.1873 - classification_loss: 0.1890 330/500 [==================>...........] - ETA: 57s - loss: 1.3743 - regression_loss: 1.1857 - classification_loss: 0.1887 331/500 [==================>...........] - ETA: 57s - loss: 1.3728 - regression_loss: 1.1844 - classification_loss: 0.1883 332/500 [==================>...........] - ETA: 57s - loss: 1.3708 - regression_loss: 1.1827 - classification_loss: 0.1881 333/500 [==================>...........] - ETA: 56s - loss: 1.3699 - regression_loss: 1.1821 - classification_loss: 0.1878 334/500 [===================>..........] - ETA: 56s - loss: 1.3700 - regression_loss: 1.1821 - classification_loss: 0.1879 335/500 [===================>..........] - ETA: 56s - loss: 1.3686 - regression_loss: 1.1810 - classification_loss: 0.1876 336/500 [===================>..........] - ETA: 55s - loss: 1.3687 - regression_loss: 1.1813 - classification_loss: 0.1874 337/500 [===================>..........] - ETA: 55s - loss: 1.3684 - regression_loss: 1.1813 - classification_loss: 0.1871 338/500 [===================>..........] - ETA: 55s - loss: 1.3697 - regression_loss: 1.1822 - classification_loss: 0.1875 339/500 [===================>..........] - ETA: 54s - loss: 1.3720 - regression_loss: 1.1838 - classification_loss: 0.1882 340/500 [===================>..........] - ETA: 54s - loss: 1.3709 - regression_loss: 1.1828 - classification_loss: 0.1880 341/500 [===================>..........] - ETA: 54s - loss: 1.3715 - regression_loss: 1.1835 - classification_loss: 0.1879 342/500 [===================>..........] - ETA: 53s - loss: 1.3717 - regression_loss: 1.1840 - classification_loss: 0.1877 343/500 [===================>..........] - ETA: 53s - loss: 1.3714 - regression_loss: 1.1840 - classification_loss: 0.1873 344/500 [===================>..........] - ETA: 53s - loss: 1.3701 - regression_loss: 1.1830 - classification_loss: 0.1871 345/500 [===================>..........] 
- ETA: 52s - loss: 1.3695 - regression_loss: 1.1825 - classification_loss: 0.1870 346/500 [===================>..........] - ETA: 52s - loss: 1.3690 - regression_loss: 1.1822 - classification_loss: 0.1868 347/500 [===================>..........] - ETA: 52s - loss: 1.3670 - regression_loss: 1.1803 - classification_loss: 0.1867 348/500 [===================>..........] - ETA: 51s - loss: 1.3664 - regression_loss: 1.1799 - classification_loss: 0.1865 349/500 [===================>..........] - ETA: 51s - loss: 1.3718 - regression_loss: 1.1836 - classification_loss: 0.1882 350/500 [====================>.........] - ETA: 51s - loss: 1.3716 - regression_loss: 1.1832 - classification_loss: 0.1884 351/500 [====================>.........] - ETA: 50s - loss: 1.3717 - regression_loss: 1.1833 - classification_loss: 0.1884 352/500 [====================>.........] - ETA: 50s - loss: 1.3725 - regression_loss: 1.1840 - classification_loss: 0.1885 353/500 [====================>.........] - ETA: 50s - loss: 1.3730 - regression_loss: 1.1846 - classification_loss: 0.1884 354/500 [====================>.........] - ETA: 49s - loss: 1.3724 - regression_loss: 1.1842 - classification_loss: 0.1882 355/500 [====================>.........] - ETA: 49s - loss: 1.3705 - regression_loss: 1.1826 - classification_loss: 0.1879 356/500 [====================>.........] - ETA: 49s - loss: 1.3691 - regression_loss: 1.1815 - classification_loss: 0.1876 357/500 [====================>.........] - ETA: 48s - loss: 1.3686 - regression_loss: 1.1813 - classification_loss: 0.1874 358/500 [====================>.........] - ETA: 48s - loss: 1.3685 - regression_loss: 1.1814 - classification_loss: 0.1871 359/500 [====================>.........] - ETA: 48s - loss: 1.3697 - regression_loss: 1.1824 - classification_loss: 0.1873 360/500 [====================>.........] - ETA: 47s - loss: 1.3699 - regression_loss: 1.1825 - classification_loss: 0.1874 361/500 [====================>.........] 
- ETA: 47s - loss: 1.3687 - regression_loss: 1.1816 - classification_loss: 0.1871 362/500 [====================>.........] - ETA: 47s - loss: 1.3697 - regression_loss: 1.1827 - classification_loss: 0.1870 363/500 [====================>.........] - ETA: 46s - loss: 1.3683 - regression_loss: 1.1815 - classification_loss: 0.1868 364/500 [====================>.........] - ETA: 46s - loss: 1.3685 - regression_loss: 1.1819 - classification_loss: 0.1867 365/500 [====================>.........] - ETA: 45s - loss: 1.3701 - regression_loss: 1.1829 - classification_loss: 0.1871 366/500 [====================>.........] - ETA: 45s - loss: 1.3690 - regression_loss: 1.1821 - classification_loss: 0.1869 367/500 [=====================>........] - ETA: 45s - loss: 1.3689 - regression_loss: 1.1821 - classification_loss: 0.1868 368/500 [=====================>........] - ETA: 44s - loss: 1.3678 - regression_loss: 1.1812 - classification_loss: 0.1865 369/500 [=====================>........] - ETA: 44s - loss: 1.3687 - regression_loss: 1.1818 - classification_loss: 0.1869 370/500 [=====================>........] - ETA: 44s - loss: 1.3683 - regression_loss: 1.1814 - classification_loss: 0.1869 371/500 [=====================>........] - ETA: 43s - loss: 1.3675 - regression_loss: 1.1809 - classification_loss: 0.1866 372/500 [=====================>........] - ETA: 43s - loss: 1.3670 - regression_loss: 1.1804 - classification_loss: 0.1866 373/500 [=====================>........] - ETA: 43s - loss: 1.3662 - regression_loss: 1.1799 - classification_loss: 0.1863 374/500 [=====================>........] - ETA: 42s - loss: 1.3674 - regression_loss: 1.1807 - classification_loss: 0.1867 375/500 [=====================>........] - ETA: 42s - loss: 1.3669 - regression_loss: 1.1804 - classification_loss: 0.1865 376/500 [=====================>........] - ETA: 42s - loss: 1.3662 - regression_loss: 1.1798 - classification_loss: 0.1865 377/500 [=====================>........] 
- ETA: 41s - loss: 1.3643 - regression_loss: 1.1782 - classification_loss: 0.1861 378/500 [=====================>........] - ETA: 41s - loss: 1.3664 - regression_loss: 1.1800 - classification_loss: 0.1864 379/500 [=====================>........] - ETA: 41s - loss: 1.3652 - regression_loss: 1.1790 - classification_loss: 0.1862 380/500 [=====================>........] - ETA: 40s - loss: 1.3638 - regression_loss: 1.1779 - classification_loss: 0.1859 381/500 [=====================>........] - ETA: 40s - loss: 1.3636 - regression_loss: 1.1779 - classification_loss: 0.1857 382/500 [=====================>........] - ETA: 40s - loss: 1.3636 - regression_loss: 1.1779 - classification_loss: 0.1857 383/500 [=====================>........] - ETA: 39s - loss: 1.3651 - regression_loss: 1.1789 - classification_loss: 0.1862 384/500 [======================>.......] - ETA: 39s - loss: 1.3641 - regression_loss: 1.1781 - classification_loss: 0.1860 385/500 [======================>.......] - ETA: 39s - loss: 1.3637 - regression_loss: 1.1777 - classification_loss: 0.1860 386/500 [======================>.......] - ETA: 38s - loss: 1.3645 - regression_loss: 1.1786 - classification_loss: 0.1859 387/500 [======================>.......] - ETA: 38s - loss: 1.3644 - regression_loss: 1.1784 - classification_loss: 0.1859 388/500 [======================>.......] - ETA: 38s - loss: 1.3653 - regression_loss: 1.1794 - classification_loss: 0.1859 389/500 [======================>.......] - ETA: 37s - loss: 1.3671 - regression_loss: 1.1807 - classification_loss: 0.1864 390/500 [======================>.......] - ETA: 37s - loss: 1.3650 - regression_loss: 1.1789 - classification_loss: 0.1860 391/500 [======================>.......] - ETA: 37s - loss: 1.3649 - regression_loss: 1.1789 - classification_loss: 0.1860 392/500 [======================>.......] - ETA: 36s - loss: 1.3641 - regression_loss: 1.1783 - classification_loss: 0.1858 393/500 [======================>.......] 
- ETA: 36s - loss: 1.3633 - regression_loss: 1.1776 - classification_loss: 0.1856 394/500 [======================>.......] - ETA: 36s - loss: 1.3630 - regression_loss: 1.1775 - classification_loss: 0.1855 395/500 [======================>.......] - ETA: 35s - loss: 1.3631 - regression_loss: 1.1776 - classification_loss: 0.1855 396/500 [======================>.......] - ETA: 35s - loss: 1.3618 - regression_loss: 1.1766 - classification_loss: 0.1852 397/500 [======================>.......] - ETA: 35s - loss: 1.3612 - regression_loss: 1.1761 - classification_loss: 0.1851 398/500 [======================>.......] - ETA: 34s - loss: 1.3611 - regression_loss: 1.1763 - classification_loss: 0.1848 399/500 [======================>.......] - ETA: 34s - loss: 1.3620 - regression_loss: 1.1773 - classification_loss: 0.1847 400/500 [=======================>......] - ETA: 34s - loss: 1.3643 - regression_loss: 1.1792 - classification_loss: 0.1851 401/500 [=======================>......] - ETA: 33s - loss: 1.3641 - regression_loss: 1.1790 - classification_loss: 0.1850 402/500 [=======================>......] - ETA: 33s - loss: 1.3633 - regression_loss: 1.1784 - classification_loss: 0.1849 403/500 [=======================>......] - ETA: 33s - loss: 1.3639 - regression_loss: 1.1789 - classification_loss: 0.1851 404/500 [=======================>......] - ETA: 32s - loss: 1.3655 - regression_loss: 1.1800 - classification_loss: 0.1855 405/500 [=======================>......] - ETA: 32s - loss: 1.3639 - regression_loss: 1.1788 - classification_loss: 0.1851 406/500 [=======================>......] - ETA: 31s - loss: 1.3653 - regression_loss: 1.1802 - classification_loss: 0.1850 407/500 [=======================>......] - ETA: 31s - loss: 1.3656 - regression_loss: 1.1806 - classification_loss: 0.1850 408/500 [=======================>......] - ETA: 31s - loss: 1.3665 - regression_loss: 1.1815 - classification_loss: 0.1850 409/500 [=======================>......] 
[progress-bar renders for steps 407–499 of epoch 12 omitted; final step and epoch summary:]
500/500 [==============================] - 170s 340ms/step - loss: 1.3628 - regression_loss: 1.1775 - classification_loss: 0.1853
326 instances of class plum with average precision: 0.8059
mAP: 0.8059
Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5
Epoch 13/150
[progress-bar renders for steps 5–243 of epoch 13 omitted; most recent step:]
243/500 [=============>................] - ETA: 1:27 - loss: 1.3605 - regression_loss: 1.1722 - classification_loss: 0.1882 244/500 [=============>................] 
- ETA: 1:26 - loss: 1.3619 - regression_loss: 1.1736 - classification_loss: 0.1883 245/500 [=============>................] - ETA: 1:26 - loss: 1.3608 - regression_loss: 1.1724 - classification_loss: 0.1884 246/500 [=============>................] - ETA: 1:26 - loss: 1.3587 - regression_loss: 1.1705 - classification_loss: 0.1882 247/500 [=============>................] - ETA: 1:25 - loss: 1.3599 - regression_loss: 1.1719 - classification_loss: 0.1881 248/500 [=============>................] - ETA: 1:25 - loss: 1.3580 - regression_loss: 1.1704 - classification_loss: 0.1876 249/500 [=============>................] - ETA: 1:25 - loss: 1.3577 - regression_loss: 1.1701 - classification_loss: 0.1876 250/500 [==============>...............] - ETA: 1:24 - loss: 1.3589 - regression_loss: 1.1711 - classification_loss: 0.1877 251/500 [==============>...............] - ETA: 1:24 - loss: 1.3597 - regression_loss: 1.1720 - classification_loss: 0.1877 252/500 [==============>...............] - ETA: 1:24 - loss: 1.3611 - regression_loss: 1.1731 - classification_loss: 0.1880 253/500 [==============>...............] - ETA: 1:23 - loss: 1.3578 - regression_loss: 1.1704 - classification_loss: 0.1874 254/500 [==============>...............] - ETA: 1:23 - loss: 1.3559 - regression_loss: 1.1689 - classification_loss: 0.1870 255/500 [==============>...............] - ETA: 1:23 - loss: 1.3586 - regression_loss: 1.1714 - classification_loss: 0.1871 256/500 [==============>...............] - ETA: 1:22 - loss: 1.3568 - regression_loss: 1.1699 - classification_loss: 0.1869 257/500 [==============>...............] - ETA: 1:22 - loss: 1.3541 - regression_loss: 1.1676 - classification_loss: 0.1865 258/500 [==============>...............] - ETA: 1:22 - loss: 1.3569 - regression_loss: 1.1701 - classification_loss: 0.1868 259/500 [==============>...............] - ETA: 1:21 - loss: 1.3553 - regression_loss: 1.1689 - classification_loss: 0.1864 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.3545 - regression_loss: 1.1683 - classification_loss: 0.1862 261/500 [==============>...............] - ETA: 1:21 - loss: 1.3524 - regression_loss: 1.1665 - classification_loss: 0.1859 262/500 [==============>...............] - ETA: 1:20 - loss: 1.3528 - regression_loss: 1.1673 - classification_loss: 0.1855 263/500 [==============>...............] - ETA: 1:20 - loss: 1.3533 - regression_loss: 1.1677 - classification_loss: 0.1855 264/500 [==============>...............] - ETA: 1:20 - loss: 1.3506 - regression_loss: 1.1655 - classification_loss: 0.1851 265/500 [==============>...............] - ETA: 1:19 - loss: 1.3520 - regression_loss: 1.1666 - classification_loss: 0.1854 266/500 [==============>...............] - ETA: 1:19 - loss: 1.3526 - regression_loss: 1.1673 - classification_loss: 0.1853 267/500 [===============>..............] - ETA: 1:19 - loss: 1.3529 - regression_loss: 1.1676 - classification_loss: 0.1853 268/500 [===============>..............] - ETA: 1:18 - loss: 1.3504 - regression_loss: 1.1655 - classification_loss: 0.1848 269/500 [===============>..............] - ETA: 1:18 - loss: 1.3510 - regression_loss: 1.1663 - classification_loss: 0.1847 270/500 [===============>..............] - ETA: 1:18 - loss: 1.3497 - regression_loss: 1.1653 - classification_loss: 0.1844 271/500 [===============>..............] - ETA: 1:17 - loss: 1.3479 - regression_loss: 1.1639 - classification_loss: 0.1840 272/500 [===============>..............] - ETA: 1:17 - loss: 1.3493 - regression_loss: 1.1649 - classification_loss: 0.1844 273/500 [===============>..............] - ETA: 1:17 - loss: 1.3493 - regression_loss: 1.1650 - classification_loss: 0.1843 274/500 [===============>..............] - ETA: 1:16 - loss: 1.3507 - regression_loss: 1.1660 - classification_loss: 0.1847 275/500 [===============>..............] - ETA: 1:16 - loss: 1.3522 - regression_loss: 1.1672 - classification_loss: 0.1850 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.3490 - regression_loss: 1.1645 - classification_loss: 0.1844 277/500 [===============>..............] - ETA: 1:15 - loss: 1.3482 - regression_loss: 1.1641 - classification_loss: 0.1841 278/500 [===============>..............] - ETA: 1:15 - loss: 1.3495 - regression_loss: 1.1652 - classification_loss: 0.1843 279/500 [===============>..............] - ETA: 1:15 - loss: 1.3494 - regression_loss: 1.1651 - classification_loss: 0.1843 280/500 [===============>..............] - ETA: 1:14 - loss: 1.3531 - regression_loss: 1.1688 - classification_loss: 0.1843 281/500 [===============>..............] - ETA: 1:14 - loss: 1.3523 - regression_loss: 1.1681 - classification_loss: 0.1842 282/500 [===============>..............] - ETA: 1:14 - loss: 1.3529 - regression_loss: 1.1686 - classification_loss: 0.1843 283/500 [===============>..............] - ETA: 1:13 - loss: 1.3526 - regression_loss: 1.1684 - classification_loss: 0.1842 284/500 [================>.............] - ETA: 1:13 - loss: 1.3560 - regression_loss: 1.1716 - classification_loss: 0.1845 285/500 [================>.............] - ETA: 1:13 - loss: 1.3571 - regression_loss: 1.1723 - classification_loss: 0.1848 286/500 [================>.............] - ETA: 1:12 - loss: 1.3599 - regression_loss: 1.1745 - classification_loss: 0.1854 287/500 [================>.............] - ETA: 1:12 - loss: 1.3602 - regression_loss: 1.1749 - classification_loss: 0.1853 288/500 [================>.............] - ETA: 1:12 - loss: 1.3591 - regression_loss: 1.1739 - classification_loss: 0.1852 289/500 [================>.............] - ETA: 1:11 - loss: 1.3571 - regression_loss: 1.1723 - classification_loss: 0.1848 290/500 [================>.............] - ETA: 1:11 - loss: 1.3587 - regression_loss: 1.1736 - classification_loss: 0.1851 291/500 [================>.............] - ETA: 1:11 - loss: 1.3567 - regression_loss: 1.1718 - classification_loss: 0.1849 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.3553 - regression_loss: 1.1708 - classification_loss: 0.1846 293/500 [================>.............] - ETA: 1:10 - loss: 1.3558 - regression_loss: 1.1711 - classification_loss: 0.1846 294/500 [================>.............] - ETA: 1:10 - loss: 1.3561 - regression_loss: 1.1716 - classification_loss: 0.1845 295/500 [================>.............] - ETA: 1:09 - loss: 1.3565 - regression_loss: 1.1719 - classification_loss: 0.1846 296/500 [================>.............] - ETA: 1:09 - loss: 1.3559 - regression_loss: 1.1714 - classification_loss: 0.1845 297/500 [================>.............] - ETA: 1:09 - loss: 1.3541 - regression_loss: 1.1699 - classification_loss: 0.1842 298/500 [================>.............] - ETA: 1:08 - loss: 1.3527 - regression_loss: 1.1689 - classification_loss: 0.1838 299/500 [================>.............] - ETA: 1:08 - loss: 1.3524 - regression_loss: 1.1687 - classification_loss: 0.1837 300/500 [=================>............] - ETA: 1:07 - loss: 1.3526 - regression_loss: 1.1689 - classification_loss: 0.1838 301/500 [=================>............] - ETA: 1:07 - loss: 1.3546 - regression_loss: 1.1705 - classification_loss: 0.1841 302/500 [=================>............] - ETA: 1:07 - loss: 1.3550 - regression_loss: 1.1706 - classification_loss: 0.1844 303/500 [=================>............] - ETA: 1:06 - loss: 1.3533 - regression_loss: 1.1693 - classification_loss: 0.1840 304/500 [=================>............] - ETA: 1:06 - loss: 1.3499 - regression_loss: 1.1664 - classification_loss: 0.1835 305/500 [=================>............] - ETA: 1:06 - loss: 1.3522 - regression_loss: 1.1681 - classification_loss: 0.1841 306/500 [=================>............] - ETA: 1:05 - loss: 1.3555 - regression_loss: 1.1706 - classification_loss: 0.1849 307/500 [=================>............] - ETA: 1:05 - loss: 1.3539 - regression_loss: 1.1690 - classification_loss: 0.1848 308/500 [=================>............] 
- ETA: 1:05 - loss: 1.3526 - regression_loss: 1.1680 - classification_loss: 0.1846 309/500 [=================>............] - ETA: 1:04 - loss: 1.3508 - regression_loss: 1.1666 - classification_loss: 0.1842 310/500 [=================>............] - ETA: 1:04 - loss: 1.3498 - regression_loss: 1.1658 - classification_loss: 0.1841 311/500 [=================>............] - ETA: 1:04 - loss: 1.3477 - regression_loss: 1.1641 - classification_loss: 0.1836 312/500 [=================>............] - ETA: 1:03 - loss: 1.3486 - regression_loss: 1.1649 - classification_loss: 0.1837 313/500 [=================>............] - ETA: 1:03 - loss: 1.3467 - regression_loss: 1.1634 - classification_loss: 0.1833 314/500 [=================>............] - ETA: 1:03 - loss: 1.3473 - regression_loss: 1.1641 - classification_loss: 0.1833 315/500 [=================>............] - ETA: 1:02 - loss: 1.3473 - regression_loss: 1.1640 - classification_loss: 0.1832 316/500 [=================>............] - ETA: 1:02 - loss: 1.3453 - regression_loss: 1.1623 - classification_loss: 0.1829 317/500 [==================>...........] - ETA: 1:02 - loss: 1.3479 - regression_loss: 1.1641 - classification_loss: 0.1837 318/500 [==================>...........] - ETA: 1:01 - loss: 1.3474 - regression_loss: 1.1640 - classification_loss: 0.1834 319/500 [==================>...........] - ETA: 1:01 - loss: 1.3469 - regression_loss: 1.1637 - classification_loss: 0.1832 320/500 [==================>...........] - ETA: 1:01 - loss: 1.3467 - regression_loss: 1.1636 - classification_loss: 0.1832 321/500 [==================>...........] - ETA: 1:00 - loss: 1.3461 - regression_loss: 1.1631 - classification_loss: 0.1830 322/500 [==================>...........] - ETA: 1:00 - loss: 1.3460 - regression_loss: 1.1632 - classification_loss: 0.1828 323/500 [==================>...........] - ETA: 1:00 - loss: 1.3436 - regression_loss: 1.1612 - classification_loss: 0.1824 324/500 [==================>...........] 
- ETA: 59s - loss: 1.3431 - regression_loss: 1.1607 - classification_loss: 0.1824  325/500 [==================>...........] - ETA: 59s - loss: 1.3428 - regression_loss: 1.1602 - classification_loss: 0.1825 326/500 [==================>...........] - ETA: 59s - loss: 1.3413 - regression_loss: 1.1591 - classification_loss: 0.1822 327/500 [==================>...........] - ETA: 58s - loss: 1.3416 - regression_loss: 1.1596 - classification_loss: 0.1820 328/500 [==================>...........] - ETA: 58s - loss: 1.3407 - regression_loss: 1.1591 - classification_loss: 0.1817 329/500 [==================>...........] - ETA: 58s - loss: 1.3420 - regression_loss: 1.1600 - classification_loss: 0.1820 330/500 [==================>...........] - ETA: 57s - loss: 1.3417 - regression_loss: 1.1597 - classification_loss: 0.1820 331/500 [==================>...........] - ETA: 57s - loss: 1.3413 - regression_loss: 1.1593 - classification_loss: 0.1820 332/500 [==================>...........] - ETA: 57s - loss: 1.3415 - regression_loss: 1.1596 - classification_loss: 0.1819 333/500 [==================>...........] - ETA: 56s - loss: 1.3419 - regression_loss: 1.1602 - classification_loss: 0.1818 334/500 [===================>..........] - ETA: 56s - loss: 1.3411 - regression_loss: 1.1594 - classification_loss: 0.1817 335/500 [===================>..........] - ETA: 56s - loss: 1.3412 - regression_loss: 1.1594 - classification_loss: 0.1819 336/500 [===================>..........] - ETA: 55s - loss: 1.3421 - regression_loss: 1.1598 - classification_loss: 0.1823 337/500 [===================>..........] - ETA: 55s - loss: 1.3441 - regression_loss: 1.1614 - classification_loss: 0.1827 338/500 [===================>..........] - ETA: 54s - loss: 1.3471 - regression_loss: 1.1638 - classification_loss: 0.1833 339/500 [===================>..........] - ETA: 54s - loss: 1.3450 - regression_loss: 1.1618 - classification_loss: 0.1832 340/500 [===================>..........] 
- ETA: 54s - loss: 1.3461 - regression_loss: 1.1627 - classification_loss: 0.1834 341/500 [===================>..........] - ETA: 53s - loss: 1.3457 - regression_loss: 1.1627 - classification_loss: 0.1831 342/500 [===================>..........] - ETA: 53s - loss: 1.3456 - regression_loss: 1.1626 - classification_loss: 0.1830 343/500 [===================>..........] - ETA: 53s - loss: 1.3450 - regression_loss: 1.1622 - classification_loss: 0.1828 344/500 [===================>..........] - ETA: 52s - loss: 1.3460 - regression_loss: 1.1632 - classification_loss: 0.1828 345/500 [===================>..........] - ETA: 52s - loss: 1.3464 - regression_loss: 1.1634 - classification_loss: 0.1830 346/500 [===================>..........] - ETA: 52s - loss: 1.3442 - regression_loss: 1.1615 - classification_loss: 0.1827 347/500 [===================>..........] - ETA: 51s - loss: 1.3423 - regression_loss: 1.1600 - classification_loss: 0.1823 348/500 [===================>..........] - ETA: 51s - loss: 1.3418 - regression_loss: 1.1596 - classification_loss: 0.1822 349/500 [===================>..........] - ETA: 51s - loss: 1.3428 - regression_loss: 1.1604 - classification_loss: 0.1824 350/500 [====================>.........] - ETA: 50s - loss: 1.3408 - regression_loss: 1.1588 - classification_loss: 0.1820 351/500 [====================>.........] - ETA: 50s - loss: 1.3406 - regression_loss: 1.1587 - classification_loss: 0.1819 352/500 [====================>.........] - ETA: 50s - loss: 1.3415 - regression_loss: 1.1593 - classification_loss: 0.1822 353/500 [====================>.........] - ETA: 49s - loss: 1.3405 - regression_loss: 1.1585 - classification_loss: 0.1820 354/500 [====================>.........] - ETA: 49s - loss: 1.3402 - regression_loss: 1.1583 - classification_loss: 0.1819 355/500 [====================>.........] - ETA: 49s - loss: 1.3407 - regression_loss: 1.1589 - classification_loss: 0.1818 356/500 [====================>.........] 
- ETA: 48s - loss: 1.3403 - regression_loss: 1.1580 - classification_loss: 0.1823 357/500 [====================>.........] - ETA: 48s - loss: 1.3425 - regression_loss: 1.1598 - classification_loss: 0.1827 358/500 [====================>.........] - ETA: 48s - loss: 1.3424 - regression_loss: 1.1598 - classification_loss: 0.1825 359/500 [====================>.........] - ETA: 47s - loss: 1.3439 - regression_loss: 1.1609 - classification_loss: 0.1829 360/500 [====================>.........] - ETA: 47s - loss: 1.3446 - regression_loss: 1.1615 - classification_loss: 0.1831 361/500 [====================>.........] - ETA: 47s - loss: 1.3420 - regression_loss: 1.1592 - classification_loss: 0.1828 362/500 [====================>.........] - ETA: 46s - loss: 1.3416 - regression_loss: 1.1588 - classification_loss: 0.1827 363/500 [====================>.........] - ETA: 46s - loss: 1.3442 - regression_loss: 1.1606 - classification_loss: 0.1836 364/500 [====================>.........] - ETA: 46s - loss: 1.3452 - regression_loss: 1.1615 - classification_loss: 0.1838 365/500 [====================>.........] - ETA: 45s - loss: 1.3449 - regression_loss: 1.1613 - classification_loss: 0.1836 366/500 [====================>.........] - ETA: 45s - loss: 1.3436 - regression_loss: 1.1602 - classification_loss: 0.1834 367/500 [=====================>........] - ETA: 45s - loss: 1.3430 - regression_loss: 1.1597 - classification_loss: 0.1834 368/500 [=====================>........] - ETA: 44s - loss: 1.3441 - regression_loss: 1.1607 - classification_loss: 0.1833 369/500 [=====================>........] - ETA: 44s - loss: 1.3451 - regression_loss: 1.1616 - classification_loss: 0.1835 370/500 [=====================>........] - ETA: 44s - loss: 1.3450 - regression_loss: 1.1615 - classification_loss: 0.1834 371/500 [=====================>........] - ETA: 43s - loss: 1.3428 - regression_loss: 1.1598 - classification_loss: 0.1830 372/500 [=====================>........] 
- ETA: 43s - loss: 1.3441 - regression_loss: 1.1610 - classification_loss: 0.1831 373/500 [=====================>........] - ETA: 43s - loss: 1.3473 - regression_loss: 1.1636 - classification_loss: 0.1837 374/500 [=====================>........] - ETA: 42s - loss: 1.3501 - regression_loss: 1.1659 - classification_loss: 0.1842 375/500 [=====================>........] - ETA: 42s - loss: 1.3483 - regression_loss: 1.1643 - classification_loss: 0.1840 376/500 [=====================>........] - ETA: 42s - loss: 1.3486 - regression_loss: 1.1644 - classification_loss: 0.1842 377/500 [=====================>........] - ETA: 41s - loss: 1.3476 - regression_loss: 1.1637 - classification_loss: 0.1840 378/500 [=====================>........] - ETA: 41s - loss: 1.3492 - regression_loss: 1.1649 - classification_loss: 0.1843 379/500 [=====================>........] - ETA: 41s - loss: 1.3468 - regression_loss: 1.1628 - classification_loss: 0.1840 380/500 [=====================>........] - ETA: 40s - loss: 1.3474 - regression_loss: 1.1634 - classification_loss: 0.1840 381/500 [=====================>........] - ETA: 40s - loss: 1.3480 - regression_loss: 1.1638 - classification_loss: 0.1842 382/500 [=====================>........] - ETA: 40s - loss: 1.3477 - regression_loss: 1.1636 - classification_loss: 0.1842 383/500 [=====================>........] - ETA: 39s - loss: 1.3489 - regression_loss: 1.1649 - classification_loss: 0.1840 384/500 [======================>.......] - ETA: 39s - loss: 1.3469 - regression_loss: 1.1632 - classification_loss: 0.1837 385/500 [======================>.......] - ETA: 39s - loss: 1.3454 - regression_loss: 1.1618 - classification_loss: 0.1835 386/500 [======================>.......] - ETA: 38s - loss: 1.3447 - regression_loss: 1.1613 - classification_loss: 0.1834 387/500 [======================>.......] - ETA: 38s - loss: 1.3430 - regression_loss: 1.1600 - classification_loss: 0.1830 388/500 [======================>.......] 
- ETA: 38s - loss: 1.3428 - regression_loss: 1.1600 - classification_loss: 0.1829 389/500 [======================>.......] - ETA: 37s - loss: 1.3433 - regression_loss: 1.1603 - classification_loss: 0.1830 390/500 [======================>.......] - ETA: 37s - loss: 1.3446 - regression_loss: 1.1613 - classification_loss: 0.1834 391/500 [======================>.......] - ETA: 37s - loss: 1.3430 - regression_loss: 1.1598 - classification_loss: 0.1831 392/500 [======================>.......] - ETA: 36s - loss: 1.3407 - regression_loss: 1.1580 - classification_loss: 0.1828 393/500 [======================>.......] - ETA: 36s - loss: 1.3409 - regression_loss: 1.1582 - classification_loss: 0.1827 394/500 [======================>.......] - ETA: 36s - loss: 1.3414 - regression_loss: 1.1588 - classification_loss: 0.1826 395/500 [======================>.......] - ETA: 35s - loss: 1.3411 - regression_loss: 1.1587 - classification_loss: 0.1824 396/500 [======================>.......] - ETA: 35s - loss: 1.3389 - regression_loss: 1.1569 - classification_loss: 0.1821 397/500 [======================>.......] - ETA: 34s - loss: 1.3396 - regression_loss: 1.1575 - classification_loss: 0.1821 398/500 [======================>.......] - ETA: 34s - loss: 1.3415 - regression_loss: 1.1591 - classification_loss: 0.1824 399/500 [======================>.......] - ETA: 34s - loss: 1.3403 - regression_loss: 1.1581 - classification_loss: 0.1821 400/500 [=======================>......] - ETA: 33s - loss: 1.3390 - regression_loss: 1.1571 - classification_loss: 0.1819 401/500 [=======================>......] - ETA: 33s - loss: 1.3386 - regression_loss: 1.1568 - classification_loss: 0.1818 402/500 [=======================>......] - ETA: 33s - loss: 1.3382 - regression_loss: 1.1564 - classification_loss: 0.1819 403/500 [=======================>......] - ETA: 32s - loss: 1.3374 - regression_loss: 1.1557 - classification_loss: 0.1817 404/500 [=======================>......] 
- ETA: 32s - loss: 1.3379 - regression_loss: 1.1563 - classification_loss: 0.1816 405/500 [=======================>......] - ETA: 32s - loss: 1.3366 - regression_loss: 1.1550 - classification_loss: 0.1817 406/500 [=======================>......] - ETA: 31s - loss: 1.3367 - regression_loss: 1.1550 - classification_loss: 0.1817 407/500 [=======================>......] - ETA: 31s - loss: 1.3351 - regression_loss: 1.1536 - classification_loss: 0.1814 408/500 [=======================>......] - ETA: 31s - loss: 1.3341 - regression_loss: 1.1529 - classification_loss: 0.1812 409/500 [=======================>......] - ETA: 30s - loss: 1.3346 - regression_loss: 1.1535 - classification_loss: 0.1812 410/500 [=======================>......] - ETA: 30s - loss: 1.3342 - regression_loss: 1.1533 - classification_loss: 0.1809 411/500 [=======================>......] - ETA: 30s - loss: 1.3351 - regression_loss: 1.1539 - classification_loss: 0.1812 412/500 [=======================>......] - ETA: 29s - loss: 1.3335 - regression_loss: 1.1527 - classification_loss: 0.1809 413/500 [=======================>......] - ETA: 29s - loss: 1.3359 - regression_loss: 1.1544 - classification_loss: 0.1815 414/500 [=======================>......] - ETA: 29s - loss: 1.3358 - regression_loss: 1.1543 - classification_loss: 0.1814 415/500 [=======================>......] - ETA: 28s - loss: 1.3357 - regression_loss: 1.1543 - classification_loss: 0.1814 416/500 [=======================>......] - ETA: 28s - loss: 1.3344 - regression_loss: 1.1532 - classification_loss: 0.1812 417/500 [========================>.....] - ETA: 28s - loss: 1.3363 - regression_loss: 1.1545 - classification_loss: 0.1818 418/500 [========================>.....] - ETA: 27s - loss: 1.3351 - regression_loss: 1.1536 - classification_loss: 0.1815 419/500 [========================>.....] - ETA: 27s - loss: 1.3322 - regression_loss: 1.1508 - classification_loss: 0.1814 420/500 [========================>.....] 
- ETA: 27s - loss: 1.3320 - regression_loss: 1.1508 - classification_loss: 0.1812 421/500 [========================>.....] - ETA: 26s - loss: 1.3316 - regression_loss: 1.1505 - classification_loss: 0.1811 422/500 [========================>.....] - ETA: 26s - loss: 1.3313 - regression_loss: 1.1503 - classification_loss: 0.1810 423/500 [========================>.....] - ETA: 26s - loss: 1.3306 - regression_loss: 1.1498 - classification_loss: 0.1809 424/500 [========================>.....] - ETA: 25s - loss: 1.3326 - regression_loss: 1.1516 - classification_loss: 0.1810 425/500 [========================>.....] - ETA: 25s - loss: 1.3349 - regression_loss: 1.1535 - classification_loss: 0.1814 426/500 [========================>.....] - ETA: 25s - loss: 1.3345 - regression_loss: 1.1532 - classification_loss: 0.1812 427/500 [========================>.....] - ETA: 24s - loss: 1.3354 - regression_loss: 1.1540 - classification_loss: 0.1814 428/500 [========================>.....] - ETA: 24s - loss: 1.3350 - regression_loss: 1.1537 - classification_loss: 0.1813 429/500 [========================>.....] - ETA: 24s - loss: 1.3390 - regression_loss: 1.1569 - classification_loss: 0.1822 430/500 [========================>.....] - ETA: 23s - loss: 1.3369 - regression_loss: 1.1551 - classification_loss: 0.1818 431/500 [========================>.....] - ETA: 23s - loss: 1.3366 - regression_loss: 1.1548 - classification_loss: 0.1817 432/500 [========================>.....] - ETA: 23s - loss: 1.3359 - regression_loss: 1.1543 - classification_loss: 0.1816 433/500 [========================>.....] - ETA: 22s - loss: 1.3370 - regression_loss: 1.1551 - classification_loss: 0.1819 434/500 [=========================>....] - ETA: 22s - loss: 1.3357 - regression_loss: 1.1541 - classification_loss: 0.1817 435/500 [=========================>....] - ETA: 22s - loss: 1.3343 - regression_loss: 1.1529 - classification_loss: 0.1814 436/500 [=========================>....] 
- ETA: 21s - loss: 1.3338 - regression_loss: 1.1526 - classification_loss: 0.1812 437/500 [=========================>....] - ETA: 21s - loss: 1.3336 - regression_loss: 1.1524 - classification_loss: 0.1812 438/500 [=========================>....] - ETA: 21s - loss: 1.3332 - regression_loss: 1.1521 - classification_loss: 0.1811 439/500 [=========================>....] - ETA: 20s - loss: 1.3334 - regression_loss: 1.1524 - classification_loss: 0.1811 440/500 [=========================>....] - ETA: 20s - loss: 1.3327 - regression_loss: 1.1518 - classification_loss: 0.1809 441/500 [=========================>....] - ETA: 20s - loss: 1.3317 - regression_loss: 1.1509 - classification_loss: 0.1808 442/500 [=========================>....] - ETA: 19s - loss: 1.3315 - regression_loss: 1.1508 - classification_loss: 0.1806 443/500 [=========================>....] - ETA: 19s - loss: 1.3314 - regression_loss: 1.1508 - classification_loss: 0.1806 444/500 [=========================>....] - ETA: 19s - loss: 1.3320 - regression_loss: 1.1512 - classification_loss: 0.1808 445/500 [=========================>....] - ETA: 18s - loss: 1.3322 - regression_loss: 1.1515 - classification_loss: 0.1807 446/500 [=========================>....] - ETA: 18s - loss: 1.3322 - regression_loss: 1.1517 - classification_loss: 0.1805 447/500 [=========================>....] - ETA: 17s - loss: 1.3325 - regression_loss: 1.1518 - classification_loss: 0.1807 448/500 [=========================>....] - ETA: 17s - loss: 1.3311 - regression_loss: 1.1507 - classification_loss: 0.1804 449/500 [=========================>....] - ETA: 17s - loss: 1.3317 - regression_loss: 1.1512 - classification_loss: 0.1805 450/500 [==========================>...] - ETA: 16s - loss: 1.3329 - regression_loss: 1.1520 - classification_loss: 0.1808 451/500 [==========================>...] - ETA: 16s - loss: 1.3340 - regression_loss: 1.1527 - classification_loss: 0.1813 452/500 [==========================>...] 
- ETA: 16s - loss: 1.3348 - regression_loss: 1.1535 - classification_loss: 0.1813 453/500 [==========================>...] - ETA: 15s - loss: 1.3340 - regression_loss: 1.1529 - classification_loss: 0.1811 454/500 [==========================>...] - ETA: 15s - loss: 1.3355 - regression_loss: 1.1541 - classification_loss: 0.1814 455/500 [==========================>...] - ETA: 15s - loss: 1.3355 - regression_loss: 1.1540 - classification_loss: 0.1815 456/500 [==========================>...] - ETA: 14s - loss: 1.3357 - regression_loss: 1.1542 - classification_loss: 0.1814 457/500 [==========================>...] - ETA: 14s - loss: 1.3359 - regression_loss: 1.1546 - classification_loss: 0.1813 458/500 [==========================>...] - ETA: 14s - loss: 1.3356 - regression_loss: 1.1545 - classification_loss: 0.1811 459/500 [==========================>...] - ETA: 13s - loss: 1.3364 - regression_loss: 1.1553 - classification_loss: 0.1811 460/500 [==========================>...] - ETA: 13s - loss: 1.3355 - regression_loss: 1.1546 - classification_loss: 0.1808 461/500 [==========================>...] - ETA: 13s - loss: 1.3355 - regression_loss: 1.1546 - classification_loss: 0.1808 462/500 [==========================>...] - ETA: 12s - loss: 1.3350 - regression_loss: 1.1544 - classification_loss: 0.1806 463/500 [==========================>...] - ETA: 12s - loss: 1.3343 - regression_loss: 1.1537 - classification_loss: 0.1806 464/500 [==========================>...] - ETA: 12s - loss: 1.3351 - regression_loss: 1.1543 - classification_loss: 0.1808 465/500 [==========================>...] - ETA: 11s - loss: 1.3350 - regression_loss: 1.1542 - classification_loss: 0.1808 466/500 [==========================>...] - ETA: 11s - loss: 1.3356 - regression_loss: 1.1548 - classification_loss: 0.1808 467/500 [===========================>..] - ETA: 11s - loss: 1.3375 - regression_loss: 1.1561 - classification_loss: 0.1813 468/500 [===========================>..] 
- ETA: 10s - loss: 1.3371 - regression_loss: 1.1559 - classification_loss: 0.1811 469/500 [===========================>..] - ETA: 10s - loss: 1.3368 - regression_loss: 1.1558 - classification_loss: 0.1811 470/500 [===========================>..] - ETA: 10s - loss: 1.3354 - regression_loss: 1.1546 - classification_loss: 0.1808 471/500 [===========================>..] - ETA: 9s - loss: 1.3341 - regression_loss: 1.1535 - classification_loss: 0.1806  472/500 [===========================>..] - ETA: 9s - loss: 1.3331 - regression_loss: 1.1528 - classification_loss: 0.1804 473/500 [===========================>..] - ETA: 9s - loss: 1.3330 - regression_loss: 1.1527 - classification_loss: 0.1803 474/500 [===========================>..] - ETA: 8s - loss: 1.3314 - regression_loss: 1.1514 - classification_loss: 0.1800 475/500 [===========================>..] - ETA: 8s - loss: 1.3313 - regression_loss: 1.1512 - classification_loss: 0.1800 476/500 [===========================>..] - ETA: 8s - loss: 1.3297 - regression_loss: 1.1498 - classification_loss: 0.1798 477/500 [===========================>..] - ETA: 7s - loss: 1.3288 - regression_loss: 1.1490 - classification_loss: 0.1798 478/500 [===========================>..] - ETA: 7s - loss: 1.3286 - regression_loss: 1.1489 - classification_loss: 0.1797 479/500 [===========================>..] - ETA: 7s - loss: 1.3277 - regression_loss: 1.1481 - classification_loss: 0.1797 480/500 [===========================>..] - ETA: 6s - loss: 1.3270 - regression_loss: 1.1476 - classification_loss: 0.1794 481/500 [===========================>..] - ETA: 6s - loss: 1.3266 - regression_loss: 1.1472 - classification_loss: 0.1795 482/500 [===========================>..] - ETA: 6s - loss: 1.3266 - regression_loss: 1.1473 - classification_loss: 0.1793 483/500 [===========================>..] - ETA: 5s - loss: 1.3261 - regression_loss: 1.1469 - classification_loss: 0.1792 484/500 [============================>.] 
- ETA: 5s - loss: 1.3269 - regression_loss: 1.1474 - classification_loss: 0.1795 485/500 [============================>.] - ETA: 5s - loss: 1.3272 - regression_loss: 1.1476 - classification_loss: 0.1795 486/500 [============================>.] - ETA: 4s - loss: 1.3271 - regression_loss: 1.1476 - classification_loss: 0.1796 487/500 [============================>.] - ETA: 4s - loss: 1.3276 - regression_loss: 1.1481 - classification_loss: 0.1795 488/500 [============================>.] - ETA: 4s - loss: 1.3277 - regression_loss: 1.1484 - classification_loss: 0.1793 489/500 [============================>.] - ETA: 3s - loss: 1.3265 - regression_loss: 1.1474 - classification_loss: 0.1791 490/500 [============================>.] - ETA: 3s - loss: 1.3260 - regression_loss: 1.1470 - classification_loss: 0.1790 491/500 [============================>.] - ETA: 3s - loss: 1.3268 - regression_loss: 1.1477 - classification_loss: 0.1791 492/500 [============================>.] - ETA: 2s - loss: 1.3279 - regression_loss: 1.1485 - classification_loss: 0.1794 493/500 [============================>.] - ETA: 2s - loss: 1.3270 - regression_loss: 1.1475 - classification_loss: 0.1795 494/500 [============================>.] - ETA: 2s - loss: 1.3272 - regression_loss: 1.1477 - classification_loss: 0.1795 495/500 [============================>.] - ETA: 1s - loss: 1.3259 - regression_loss: 1.1467 - classification_loss: 0.1792 496/500 [============================>.] - ETA: 1s - loss: 1.3256 - regression_loss: 1.1464 - classification_loss: 0.1791 497/500 [============================>.] - ETA: 1s - loss: 1.3254 - regression_loss: 1.1464 - classification_loss: 0.1791 498/500 [============================>.] - ETA: 0s - loss: 1.3257 - regression_loss: 1.1466 - classification_loss: 0.1791 499/500 [============================>.] 
- ETA: 0s - loss: 1.3261 - regression_loss: 1.1470 - classification_loss: 0.1792 500/500 [==============================] - 170s 339ms/step - loss: 1.3251 - regression_loss: 1.1462 - classification_loss: 0.1789
326 instances of class plum with average precision: 0.8093
mAP: 0.8093
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/150
1/500 [..............................] - ETA: 2:40 - loss: 1.3242 - regression_loss: 1.0308 - classification_loss: 0.2934 2/500 [..............................] - ETA: 2:45 - loss: 1.3049 - regression_loss: 1.0572 - classification_loss: 0.2477 3/500 [..............................] - ETA: 2:44 - loss: 1.6871 - regression_loss: 1.4191 - classification_loss: 0.2680 4/500 [..............................] - ETA: 2:44 - loss: 1.5618 - regression_loss: 1.3290 - classification_loss: 0.2328 5/500 [..............................] - ETA: 2:44 - loss: 1.4154 - regression_loss: 1.2134 - classification_loss: 0.2020 6/500 [..............................] - ETA: 2:45 - loss: 1.4325 - regression_loss: 1.2281 - classification_loss: 0.2044 7/500 [..............................] - ETA: 2:45 - loss: 1.4378 - regression_loss: 1.2301 - classification_loss: 0.2077 8/500 [..............................] - ETA: 2:45 - loss: 1.4302 - regression_loss: 1.2283 - classification_loss: 0.2019 9/500 [..............................] - ETA: 2:45 - loss: 1.4060 - regression_loss: 1.2191 - classification_loss: 0.1869 10/500 [..............................] - ETA: 2:44 - loss: 1.3931 - regression_loss: 1.2082 - classification_loss: 0.1848 11/500 [..............................] - ETA: 2:44 - loss: 1.3226 - regression_loss: 1.1422 - classification_loss: 0.1805 12/500 [..............................] - ETA: 2:43 - loss: 1.3043 - regression_loss: 1.1283 - classification_loss: 0.1760 13/500 [..............................] - ETA: 2:42 - loss: 1.3048 - regression_loss: 1.1301 - classification_loss: 0.1747 14/500 [..............................]
- ETA: 2:41 - loss: 1.2855 - regression_loss: 1.1145 - classification_loss: 0.1710 15/500 [..............................] - ETA: 2:41 - loss: 1.2668 - regression_loss: 1.0992 - classification_loss: 0.1676 16/500 [..............................] - ETA: 2:41 - loss: 1.2928 - regression_loss: 1.1170 - classification_loss: 0.1758 17/500 [>.............................] - ETA: 2:41 - loss: 1.2549 - regression_loss: 1.0841 - classification_loss: 0.1707 18/500 [>.............................] - ETA: 2:41 - loss: 1.2421 - regression_loss: 1.0731 - classification_loss: 0.1690 19/500 [>.............................] - ETA: 2:40 - loss: 1.2429 - regression_loss: 1.0767 - classification_loss: 0.1663 20/500 [>.............................] - ETA: 2:40 - loss: 1.2698 - regression_loss: 1.1009 - classification_loss: 0.1689 21/500 [>.............................] - ETA: 2:40 - loss: 1.2427 - regression_loss: 1.0754 - classification_loss: 0.1674 22/500 [>.............................] - ETA: 2:40 - loss: 1.2744 - regression_loss: 1.1066 - classification_loss: 0.1678 23/500 [>.............................] - ETA: 2:40 - loss: 1.2791 - regression_loss: 1.1131 - classification_loss: 0.1659 24/500 [>.............................] - ETA: 2:40 - loss: 1.2765 - regression_loss: 1.1128 - classification_loss: 0.1637 25/500 [>.............................] - ETA: 2:40 - loss: 1.2872 - regression_loss: 1.1162 - classification_loss: 0.1709 26/500 [>.............................] - ETA: 2:40 - loss: 1.2693 - regression_loss: 1.1019 - classification_loss: 0.1674 27/500 [>.............................] - ETA: 2:39 - loss: 1.3027 - regression_loss: 1.1258 - classification_loss: 0.1769 28/500 [>.............................] - ETA: 2:39 - loss: 1.3125 - regression_loss: 1.1341 - classification_loss: 0.1784 29/500 [>.............................] - ETA: 2:39 - loss: 1.2928 - regression_loss: 1.1189 - classification_loss: 0.1740 30/500 [>.............................] 
- ETA: 2:39 - loss: 1.2648 - regression_loss: 1.0954 - classification_loss: 0.1693 31/500 [>.............................] - ETA: 2:38 - loss: 1.2775 - regression_loss: 1.1052 - classification_loss: 0.1723 32/500 [>.............................] - ETA: 2:38 - loss: 1.2932 - regression_loss: 1.1194 - classification_loss: 0.1738 33/500 [>.............................] - ETA: 2:38 - loss: 1.2801 - regression_loss: 1.1088 - classification_loss: 0.1713 34/500 [=>............................] - ETA: 2:38 - loss: 1.2712 - regression_loss: 1.1026 - classification_loss: 0.1686 35/500 [=>............................] - ETA: 2:37 - loss: 1.2719 - regression_loss: 1.1045 - classification_loss: 0.1674 36/500 [=>............................] - ETA: 2:37 - loss: 1.2802 - regression_loss: 1.1133 - classification_loss: 0.1669 37/500 [=>............................] - ETA: 2:37 - loss: 1.2810 - regression_loss: 1.1146 - classification_loss: 0.1664 38/500 [=>............................] - ETA: 2:36 - loss: 1.3073 - regression_loss: 1.1380 - classification_loss: 0.1693 39/500 [=>............................] - ETA: 2:36 - loss: 1.2962 - regression_loss: 1.1283 - classification_loss: 0.1679 40/500 [=>............................] - ETA: 2:35 - loss: 1.2792 - regression_loss: 1.1144 - classification_loss: 0.1648 41/500 [=>............................] - ETA: 2:35 - loss: 1.2846 - regression_loss: 1.1189 - classification_loss: 0.1657 42/500 [=>............................] - ETA: 2:35 - loss: 1.2945 - regression_loss: 1.1285 - classification_loss: 0.1660 43/500 [=>............................] - ETA: 2:34 - loss: 1.2788 - regression_loss: 1.1157 - classification_loss: 0.1631 44/500 [=>............................] - ETA: 2:34 - loss: 1.2981 - regression_loss: 1.1299 - classification_loss: 0.1681 45/500 [=>............................] - ETA: 2:34 - loss: 1.2987 - regression_loss: 1.1316 - classification_loss: 0.1671 46/500 [=>............................] 
- ETA: 2:33 - loss: 1.3008 - regression_loss: 1.1341 - classification_loss: 0.1667 47/500 [=>............................] - ETA: 2:33 - loss: 1.3102 - regression_loss: 1.1411 - classification_loss: 0.1691 48/500 [=>............................] - ETA: 2:33 - loss: 1.3092 - regression_loss: 1.1406 - classification_loss: 0.1687 49/500 [=>............................] - ETA: 2:32 - loss: 1.3054 - regression_loss: 1.1386 - classification_loss: 0.1668 50/500 [==>...........................] - ETA: 2:32 - loss: 1.3122 - regression_loss: 1.1455 - classification_loss: 0.1667 51/500 [==>...........................] - ETA: 2:31 - loss: 1.3053 - regression_loss: 1.1401 - classification_loss: 0.1652 52/500 [==>...........................] - ETA: 2:31 - loss: 1.3027 - regression_loss: 1.1392 - classification_loss: 0.1635 53/500 [==>...........................] - ETA: 2:31 - loss: 1.3076 - regression_loss: 1.1446 - classification_loss: 0.1630 54/500 [==>...........................] - ETA: 2:31 - loss: 1.3103 - regression_loss: 1.1475 - classification_loss: 0.1628 55/500 [==>...........................] - ETA: 2:30 - loss: 1.3117 - regression_loss: 1.1482 - classification_loss: 0.1635 56/500 [==>...........................] - ETA: 2:30 - loss: 1.3053 - regression_loss: 1.1434 - classification_loss: 0.1619 57/500 [==>...........................] - ETA: 2:30 - loss: 1.2953 - regression_loss: 1.1352 - classification_loss: 0.1601 58/500 [==>...........................] - ETA: 2:29 - loss: 1.2890 - regression_loss: 1.1304 - classification_loss: 0.1587 59/500 [==>...........................] - ETA: 2:29 - loss: 1.2891 - regression_loss: 1.1305 - classification_loss: 0.1586 60/500 [==>...........................] - ETA: 2:29 - loss: 1.2886 - regression_loss: 1.1302 - classification_loss: 0.1584 61/500 [==>...........................] - ETA: 2:28 - loss: 1.2836 - regression_loss: 1.1246 - classification_loss: 0.1590 62/500 [==>...........................] 
- ETA: 2:28 - loss: 1.2873 - regression_loss: 1.1272 - classification_loss: 0.1602 63/500 [==>...........................] - ETA: 2:27 - loss: 1.2838 - regression_loss: 1.1230 - classification_loss: 0.1609 64/500 [==>...........................] - ETA: 2:27 - loss: 1.2844 - regression_loss: 1.1236 - classification_loss: 0.1608 65/500 [==>...........................] - ETA: 2:27 - loss: 1.2929 - regression_loss: 1.1307 - classification_loss: 0.1622 66/500 [==>...........................] - ETA: 2:26 - loss: 1.2950 - regression_loss: 1.1326 - classification_loss: 0.1623 67/500 [===>..........................] - ETA: 2:26 - loss: 1.2812 - regression_loss: 1.1205 - classification_loss: 0.1607 68/500 [===>..........................] - ETA: 2:26 - loss: 1.2840 - regression_loss: 1.1234 - classification_loss: 0.1606 69/500 [===>..........................] - ETA: 2:25 - loss: 1.2823 - regression_loss: 1.1224 - classification_loss: 0.1599 70/500 [===>..........................] - ETA: 2:25 - loss: 1.2810 - regression_loss: 1.1219 - classification_loss: 0.1591 71/500 [===>..........................] - ETA: 2:25 - loss: 1.2850 - regression_loss: 1.1255 - classification_loss: 0.1595 72/500 [===>..........................] - ETA: 2:24 - loss: 1.2741 - regression_loss: 1.1163 - classification_loss: 0.1579 73/500 [===>..........................] - ETA: 2:24 - loss: 1.2783 - regression_loss: 1.1198 - classification_loss: 0.1584 74/500 [===>..........................] - ETA: 2:24 - loss: 1.2745 - regression_loss: 1.1168 - classification_loss: 0.1578 75/500 [===>..........................] - ETA: 2:24 - loss: 1.2638 - regression_loss: 1.1074 - classification_loss: 0.1564 76/500 [===>..........................] - ETA: 2:23 - loss: 1.2757 - regression_loss: 1.1149 - classification_loss: 0.1608 77/500 [===>..........................] - ETA: 2:23 - loss: 1.2864 - regression_loss: 1.1241 - classification_loss: 0.1623 78/500 [===>..........................] 
- ETA: 2:23 - loss: 1.2760 - regression_loss: 1.1149 - classification_loss: 0.1611 79/500 [===>..........................] - ETA: 2:22 - loss: 1.2685 - regression_loss: 1.1083 - classification_loss: 0.1601 80/500 [===>..........................] - ETA: 2:22 - loss: 1.2617 - regression_loss: 1.1030 - classification_loss: 0.1587 81/500 [===>..........................] - ETA: 2:22 - loss: 1.2606 - regression_loss: 1.1023 - classification_loss: 0.1583 82/500 [===>..........................] - ETA: 2:21 - loss: 1.2576 - regression_loss: 1.0997 - classification_loss: 0.1579 83/500 [===>..........................] - ETA: 2:21 - loss: 1.2552 - regression_loss: 1.0982 - classification_loss: 0.1570 84/500 [====>.........................] - ETA: 2:21 - loss: 1.2673 - regression_loss: 1.1072 - classification_loss: 0.1601 85/500 [====>.........................] - ETA: 2:20 - loss: 1.2624 - regression_loss: 1.1027 - classification_loss: 0.1597 86/500 [====>.........................] - ETA: 2:20 - loss: 1.2653 - regression_loss: 1.1045 - classification_loss: 0.1608 87/500 [====>.........................] - ETA: 2:20 - loss: 1.2600 - regression_loss: 1.1004 - classification_loss: 0.1596 88/500 [====>.........................] - ETA: 2:19 - loss: 1.2643 - regression_loss: 1.1038 - classification_loss: 0.1605 89/500 [====>.........................] - ETA: 2:19 - loss: 1.2681 - regression_loss: 1.1065 - classification_loss: 0.1616 90/500 [====>.........................] - ETA: 2:18 - loss: 1.2637 - regression_loss: 1.1028 - classification_loss: 0.1609 91/500 [====>.........................] - ETA: 2:18 - loss: 1.2709 - regression_loss: 1.1082 - classification_loss: 0.1627 92/500 [====>.........................] - ETA: 2:18 - loss: 1.2803 - regression_loss: 1.1153 - classification_loss: 0.1650 93/500 [====>.........................] - ETA: 2:17 - loss: 1.2815 - regression_loss: 1.1165 - classification_loss: 0.1650 94/500 [====>.........................] 
- ETA: 2:17 - loss: 1.2775 - regression_loss: 1.1136 - classification_loss: 0.1639 95/500 [====>.........................] - ETA: 2:17 - loss: 1.2719 - regression_loss: 1.1089 - classification_loss: 0.1630 96/500 [====>.........................] - ETA: 2:16 - loss: 1.2687 - regression_loss: 1.1064 - classification_loss: 0.1623 97/500 [====>.........................] - ETA: 2:16 - loss: 1.2646 - regression_loss: 1.1027 - classification_loss: 0.1619 98/500 [====>.........................] - ETA: 2:16 - loss: 1.2686 - regression_loss: 1.1063 - classification_loss: 0.1623 99/500 [====>.........................] - ETA: 2:15 - loss: 1.2697 - regression_loss: 1.1074 - classification_loss: 0.1622 100/500 [=====>........................] - ETA: 2:15 - loss: 1.2633 - regression_loss: 1.1021 - classification_loss: 0.1613 101/500 [=====>........................] - ETA: 2:15 - loss: 1.2625 - regression_loss: 1.1016 - classification_loss: 0.1610 102/500 [=====>........................] - ETA: 2:14 - loss: 1.2601 - regression_loss: 1.0991 - classification_loss: 0.1610 103/500 [=====>........................] - ETA: 2:14 - loss: 1.2552 - regression_loss: 1.0946 - classification_loss: 0.1605 104/500 [=====>........................] - ETA: 2:14 - loss: 1.2525 - regression_loss: 1.0923 - classification_loss: 0.1601 105/500 [=====>........................] - ETA: 2:13 - loss: 1.2494 - regression_loss: 1.0902 - classification_loss: 0.1592 106/500 [=====>........................] - ETA: 2:13 - loss: 1.2559 - regression_loss: 1.0960 - classification_loss: 0.1599 107/500 [=====>........................] - ETA: 2:13 - loss: 1.2575 - regression_loss: 1.0968 - classification_loss: 0.1607 108/500 [=====>........................] - ETA: 2:12 - loss: 1.2596 - regression_loss: 1.0986 - classification_loss: 0.1610 109/500 [=====>........................] - ETA: 2:12 - loss: 1.2582 - regression_loss: 1.0974 - classification_loss: 0.1608 110/500 [=====>........................] 
- ETA: 2:12 - loss: 1.2509 - regression_loss: 1.0911 - classification_loss: 0.1598 111/500 [=====>........................] - ETA: 2:11 - loss: 1.2505 - regression_loss: 1.0906 - classification_loss: 0.1599 112/500 [=====>........................] - ETA: 2:11 - loss: 1.2574 - regression_loss: 1.0954 - classification_loss: 0.1620 113/500 [=====>........................] - ETA: 2:11 - loss: 1.2518 - regression_loss: 1.0905 - classification_loss: 0.1614 114/500 [=====>........................] - ETA: 2:10 - loss: 1.2510 - regression_loss: 1.0904 - classification_loss: 0.1606 115/500 [=====>........................] - ETA: 2:10 - loss: 1.2534 - regression_loss: 1.0925 - classification_loss: 0.1609 116/500 [=====>........................] - ETA: 2:10 - loss: 1.2567 - regression_loss: 1.0951 - classification_loss: 0.1615 117/500 [======>.......................] - ETA: 2:10 - loss: 1.2522 - regression_loss: 1.0917 - classification_loss: 0.1605 118/500 [======>.......................] - ETA: 2:09 - loss: 1.2515 - regression_loss: 1.0914 - classification_loss: 0.1600 119/500 [======>.......................] - ETA: 2:09 - loss: 1.2497 - regression_loss: 1.0894 - classification_loss: 0.1604 120/500 [======>.......................] - ETA: 2:09 - loss: 1.2497 - regression_loss: 1.0898 - classification_loss: 0.1598 121/500 [======>.......................] - ETA: 2:08 - loss: 1.2490 - regression_loss: 1.0897 - classification_loss: 0.1593 122/500 [======>.......................] - ETA: 2:08 - loss: 1.2474 - regression_loss: 1.0883 - classification_loss: 0.1592 123/500 [======>.......................] - ETA: 2:08 - loss: 1.2497 - regression_loss: 1.0904 - classification_loss: 0.1593 124/500 [======>.......................] - ETA: 2:07 - loss: 1.2482 - regression_loss: 1.0895 - classification_loss: 0.1586 125/500 [======>.......................] - ETA: 2:07 - loss: 1.2506 - regression_loss: 1.0916 - classification_loss: 0.1590 126/500 [======>.......................] 
- ETA: 2:07 - loss: 1.2494 - regression_loss: 1.0911 - classification_loss: 0.1583 127/500 [======>.......................] - ETA: 2:06 - loss: 1.2434 - regression_loss: 1.0861 - classification_loss: 0.1573 128/500 [======>.......................] - ETA: 2:06 - loss: 1.2431 - regression_loss: 1.0859 - classification_loss: 0.1572 129/500 [======>.......................] - ETA: 2:06 - loss: 1.2417 - regression_loss: 1.0849 - classification_loss: 0.1568 130/500 [======>.......................] - ETA: 2:05 - loss: 1.2442 - regression_loss: 1.0875 - classification_loss: 0.1567 131/500 [======>.......................] - ETA: 2:05 - loss: 1.2397 - regression_loss: 1.0833 - classification_loss: 0.1564 132/500 [======>.......................] - ETA: 2:05 - loss: 1.2415 - regression_loss: 1.0843 - classification_loss: 0.1572 133/500 [======>.......................] - ETA: 2:04 - loss: 1.2457 - regression_loss: 1.0866 - classification_loss: 0.1591 134/500 [=======>......................] - ETA: 2:04 - loss: 1.2458 - regression_loss: 1.0870 - classification_loss: 0.1588 135/500 [=======>......................] - ETA: 2:03 - loss: 1.2481 - regression_loss: 1.0880 - classification_loss: 0.1601 136/500 [=======>......................] - ETA: 2:03 - loss: 1.2550 - regression_loss: 1.0938 - classification_loss: 0.1612 137/500 [=======>......................] - ETA: 2:03 - loss: 1.2562 - regression_loss: 1.0952 - classification_loss: 0.1610 138/500 [=======>......................] - ETA: 2:02 - loss: 1.2557 - regression_loss: 1.0946 - classification_loss: 0.1610 139/500 [=======>......................] - ETA: 2:02 - loss: 1.2554 - regression_loss: 1.0948 - classification_loss: 0.1607 140/500 [=======>......................] - ETA: 2:02 - loss: 1.2564 - regression_loss: 1.0958 - classification_loss: 0.1606 141/500 [=======>......................] - ETA: 2:01 - loss: 1.2530 - regression_loss: 1.0930 - classification_loss: 0.1600 142/500 [=======>......................] 
- ETA: 2:01 - loss: 1.2536 - regression_loss: 1.0927 - classification_loss: 0.1609 143/500 [=======>......................] - ETA: 2:01 - loss: 1.2546 - regression_loss: 1.0935 - classification_loss: 0.1611 144/500 [=======>......................] - ETA: 2:00 - loss: 1.2526 - regression_loss: 1.0921 - classification_loss: 0.1605 145/500 [=======>......................] - ETA: 2:00 - loss: 1.2472 - regression_loss: 1.0873 - classification_loss: 0.1599 146/500 [=======>......................] - ETA: 2:00 - loss: 1.2517 - regression_loss: 1.0915 - classification_loss: 0.1602 147/500 [=======>......................] - ETA: 1:59 - loss: 1.2578 - regression_loss: 1.0967 - classification_loss: 0.1611 148/500 [=======>......................] - ETA: 1:59 - loss: 1.2570 - regression_loss: 1.0959 - classification_loss: 0.1611 149/500 [=======>......................] - ETA: 1:59 - loss: 1.2540 - regression_loss: 1.0936 - classification_loss: 0.1604 150/500 [========>.....................] - ETA: 1:58 - loss: 1.2522 - regression_loss: 1.0922 - classification_loss: 0.1601 151/500 [========>.....................] - ETA: 1:58 - loss: 1.2575 - regression_loss: 1.0964 - classification_loss: 0.1611 152/500 [========>.....................] - ETA: 1:58 - loss: 1.2568 - regression_loss: 1.0953 - classification_loss: 0.1615 153/500 [========>.....................] - ETA: 1:57 - loss: 1.2587 - regression_loss: 1.0970 - classification_loss: 0.1617 154/500 [========>.....................] - ETA: 1:57 - loss: 1.2611 - regression_loss: 1.0998 - classification_loss: 0.1613 155/500 [========>.....................] - ETA: 1:56 - loss: 1.2589 - regression_loss: 1.0980 - classification_loss: 0.1609 156/500 [========>.....................] - ETA: 1:56 - loss: 1.2721 - regression_loss: 1.1086 - classification_loss: 0.1635 157/500 [========>.....................] - ETA: 1:56 - loss: 1.2719 - regression_loss: 1.1085 - classification_loss: 0.1634 158/500 [========>.....................] 
- ETA: 1:55 - loss: 1.2718 - regression_loss: 1.1085 - classification_loss: 0.1633 159/500 [========>.....................] - ETA: 1:55 - loss: 1.2694 - regression_loss: 1.1067 - classification_loss: 0.1627 160/500 [========>.....................] - ETA: 1:55 - loss: 1.2728 - regression_loss: 1.1089 - classification_loss: 0.1639 161/500 [========>.....................] - ETA: 1:54 - loss: 1.2710 - regression_loss: 1.1073 - classification_loss: 0.1636 162/500 [========>.....................] - ETA: 1:54 - loss: 1.2731 - regression_loss: 1.1092 - classification_loss: 0.1639 163/500 [========>.....................] - ETA: 1:54 - loss: 1.2748 - regression_loss: 1.1107 - classification_loss: 0.1642 164/500 [========>.....................] - ETA: 1:53 - loss: 1.2709 - regression_loss: 1.1075 - classification_loss: 0.1635 165/500 [========>.....................] - ETA: 1:53 - loss: 1.2719 - regression_loss: 1.1080 - classification_loss: 0.1639 166/500 [========>.....................] - ETA: 1:53 - loss: 1.2694 - regression_loss: 1.1059 - classification_loss: 0.1635 167/500 [=========>....................] - ETA: 1:52 - loss: 1.2693 - regression_loss: 1.1058 - classification_loss: 0.1635 168/500 [=========>....................] - ETA: 1:52 - loss: 1.2693 - regression_loss: 1.1059 - classification_loss: 0.1634 169/500 [=========>....................] - ETA: 1:52 - loss: 1.2718 - regression_loss: 1.1078 - classification_loss: 0.1640 170/500 [=========>....................] - ETA: 1:51 - loss: 1.2736 - regression_loss: 1.1089 - classification_loss: 0.1647 171/500 [=========>....................] - ETA: 1:51 - loss: 1.2755 - regression_loss: 1.1105 - classification_loss: 0.1651 172/500 [=========>....................] - ETA: 1:51 - loss: 1.2756 - regression_loss: 1.1105 - classification_loss: 0.1650 173/500 [=========>....................] - ETA: 1:50 - loss: 1.2792 - regression_loss: 1.1130 - classification_loss: 0.1662 174/500 [=========>....................] 
- ETA: 1:50 - loss: 1.2774 - regression_loss: 1.1108 - classification_loss: 0.1666 175/500 [=========>....................] - ETA: 1:50 - loss: 1.2865 - regression_loss: 1.1177 - classification_loss: 0.1688 176/500 [=========>....................] - ETA: 1:49 - loss: 1.2813 - regression_loss: 1.1133 - classification_loss: 0.1680 177/500 [=========>....................] - ETA: 1:49 - loss: 1.2805 - regression_loss: 1.1127 - classification_loss: 0.1678 178/500 [=========>....................] - ETA: 1:48 - loss: 1.2809 - regression_loss: 1.1132 - classification_loss: 0.1677 179/500 [=========>....................] - ETA: 1:48 - loss: 1.2797 - regression_loss: 1.1124 - classification_loss: 0.1674 180/500 [=========>....................] - ETA: 1:48 - loss: 1.2814 - regression_loss: 1.1137 - classification_loss: 0.1677 181/500 [=========>....................] - ETA: 1:47 - loss: 1.2821 - regression_loss: 1.1149 - classification_loss: 0.1672 182/500 [=========>....................] - ETA: 1:47 - loss: 1.2821 - regression_loss: 1.1148 - classification_loss: 0.1672 183/500 [=========>....................] - ETA: 1:47 - loss: 1.2821 - regression_loss: 1.1149 - classification_loss: 0.1672 184/500 [==========>...................] - ETA: 1:46 - loss: 1.2806 - regression_loss: 1.1137 - classification_loss: 0.1669 185/500 [==========>...................] - ETA: 1:46 - loss: 1.2813 - regression_loss: 1.1142 - classification_loss: 0.1671 186/500 [==========>...................] - ETA: 1:46 - loss: 1.2811 - regression_loss: 1.1140 - classification_loss: 0.1672 187/500 [==========>...................] - ETA: 1:45 - loss: 1.2810 - regression_loss: 1.1142 - classification_loss: 0.1668 188/500 [==========>...................] - ETA: 1:45 - loss: 1.2792 - regression_loss: 1.1130 - classification_loss: 0.1663 189/500 [==========>...................] - ETA: 1:45 - loss: 1.2811 - regression_loss: 1.1146 - classification_loss: 0.1665 190/500 [==========>...................] 
- ETA: 1:44 - loss: 1.2796 - regression_loss: 1.1132 - classification_loss: 0.1664 191/500 [==========>...................] - ETA: 1:44 - loss: 1.2813 - regression_loss: 1.1146 - classification_loss: 0.1667 192/500 [==========>...................] - ETA: 1:44 - loss: 1.2777 - regression_loss: 1.1116 - classification_loss: 0.1661 193/500 [==========>...................] - ETA: 1:43 - loss: 1.2772 - regression_loss: 1.1113 - classification_loss: 0.1659 194/500 [==========>...................] - ETA: 1:43 - loss: 1.2802 - regression_loss: 1.1142 - classification_loss: 0.1660 195/500 [==========>...................] - ETA: 1:42 - loss: 1.2793 - regression_loss: 1.1135 - classification_loss: 0.1658 196/500 [==========>...................] - ETA: 1:42 - loss: 1.2821 - regression_loss: 1.1154 - classification_loss: 0.1667 197/500 [==========>...................] - ETA: 1:42 - loss: 1.2870 - regression_loss: 1.1200 - classification_loss: 0.1670 198/500 [==========>...................] - ETA: 1:41 - loss: 1.2832 - regression_loss: 1.1167 - classification_loss: 0.1664 199/500 [==========>...................] - ETA: 1:41 - loss: 1.2822 - regression_loss: 1.1159 - classification_loss: 0.1663 200/500 [===========>..................] - ETA: 1:41 - loss: 1.2816 - regression_loss: 1.1153 - classification_loss: 0.1663 201/500 [===========>..................] - ETA: 1:40 - loss: 1.2824 - regression_loss: 1.1161 - classification_loss: 0.1663 202/500 [===========>..................] - ETA: 1:40 - loss: 1.2793 - regression_loss: 1.1135 - classification_loss: 0.1657 203/500 [===========>..................] - ETA: 1:40 - loss: 1.2779 - regression_loss: 1.1125 - classification_loss: 0.1654 204/500 [===========>..................] - ETA: 1:39 - loss: 1.2798 - regression_loss: 1.1145 - classification_loss: 0.1654 205/500 [===========>..................] - ETA: 1:39 - loss: 1.2795 - regression_loss: 1.1143 - classification_loss: 0.1653 206/500 [===========>..................] 
- ETA: 1:39 - loss: 1.2811 - regression_loss: 1.1157 - classification_loss: 0.1654 207/500 [===========>..................] - ETA: 1:38 - loss: 1.2792 - regression_loss: 1.1141 - classification_loss: 0.1651 208/500 [===========>..................] - ETA: 1:38 - loss: 1.2773 - regression_loss: 1.1126 - classification_loss: 0.1647 209/500 [===========>..................] - ETA: 1:38 - loss: 1.2838 - regression_loss: 1.1183 - classification_loss: 0.1655 210/500 [===========>..................] - ETA: 1:37 - loss: 1.2875 - regression_loss: 1.1210 - classification_loss: 0.1666 211/500 [===========>..................] - ETA: 1:37 - loss: 1.2870 - regression_loss: 1.1207 - classification_loss: 0.1662 212/500 [===========>..................] - ETA: 1:37 - loss: 1.2878 - regression_loss: 1.1216 - classification_loss: 0.1663 213/500 [===========>..................] - ETA: 1:36 - loss: 1.2862 - regression_loss: 1.1203 - classification_loss: 0.1659 214/500 [===========>..................] - ETA: 1:36 - loss: 1.2886 - regression_loss: 1.1225 - classification_loss: 0.1661 215/500 [===========>..................] - ETA: 1:36 - loss: 1.2873 - regression_loss: 1.1214 - classification_loss: 0.1659 216/500 [===========>..................] - ETA: 1:35 - loss: 1.2875 - regression_loss: 1.1216 - classification_loss: 0.1659 217/500 [============>.................] - ETA: 1:35 - loss: 1.2891 - regression_loss: 1.1227 - classification_loss: 0.1664 218/500 [============>.................] - ETA: 1:35 - loss: 1.2872 - regression_loss: 1.1209 - classification_loss: 0.1663 219/500 [============>.................] - ETA: 1:34 - loss: 1.2833 - regression_loss: 1.1176 - classification_loss: 0.1657 220/500 [============>.................] - ETA: 1:34 - loss: 1.2832 - regression_loss: 1.1176 - classification_loss: 0.1656 221/500 [============>.................] - ETA: 1:34 - loss: 1.2853 - regression_loss: 1.1194 - classification_loss: 0.1659 222/500 [============>.................] 
- ETA: 1:33 - loss: 1.2853 - regression_loss: 1.1195 - classification_loss: 0.1659
[... per-step progress output for steps 223-499 elided ...]
500/500 [==============================] - 170s 340ms/step - loss: 1.2828 - regression_loss: 1.1149 - classification_loss: 0.1679
326 instances of class plum with average precision: 0.8159
mAP: 0.8159
Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5
Epoch 15/150
1/500 [..............................] - ETA: 2:41 - loss: 1.4373 - regression_loss: 1.3079 - classification_loss: 0.1294
[... per-step progress output for steps 2-56 elided ...]
57/500 [==>...........................] 
- ETA: 2:30 - loss: 1.3146 - regression_loss: 1.1409 - classification_loss: 0.1737 58/500 [==>...........................] - ETA: 2:30 - loss: 1.2991 - regression_loss: 1.1278 - classification_loss: 0.1713 59/500 [==>...........................] - ETA: 2:29 - loss: 1.2930 - regression_loss: 1.1230 - classification_loss: 0.1700 60/500 [==>...........................] - ETA: 2:29 - loss: 1.2899 - regression_loss: 1.1201 - classification_loss: 0.1698 61/500 [==>...........................] - ETA: 2:29 - loss: 1.2803 - regression_loss: 1.1119 - classification_loss: 0.1684 62/500 [==>...........................] - ETA: 2:28 - loss: 1.2826 - regression_loss: 1.1141 - classification_loss: 0.1685 63/500 [==>...........................] - ETA: 2:28 - loss: 1.2806 - regression_loss: 1.1121 - classification_loss: 0.1685 64/500 [==>...........................] - ETA: 2:28 - loss: 1.2789 - regression_loss: 1.1108 - classification_loss: 0.1681 65/500 [==>...........................] - ETA: 2:28 - loss: 1.2816 - regression_loss: 1.1129 - classification_loss: 0.1687 66/500 [==>...........................] - ETA: 2:27 - loss: 1.2764 - regression_loss: 1.1085 - classification_loss: 0.1680 67/500 [===>..........................] - ETA: 2:27 - loss: 1.2783 - regression_loss: 1.1108 - classification_loss: 0.1675 68/500 [===>..........................] - ETA: 2:27 - loss: 1.2709 - regression_loss: 1.1048 - classification_loss: 0.1661 69/500 [===>..........................] - ETA: 2:27 - loss: 1.2631 - regression_loss: 1.0985 - classification_loss: 0.1646 70/500 [===>..........................] - ETA: 2:26 - loss: 1.2619 - regression_loss: 1.0977 - classification_loss: 0.1642 71/500 [===>..........................] - ETA: 2:26 - loss: 1.2583 - regression_loss: 1.0957 - classification_loss: 0.1626 72/500 [===>..........................] - ETA: 2:26 - loss: 1.2546 - regression_loss: 1.0928 - classification_loss: 0.1617 73/500 [===>..........................] 
- ETA: 2:25 - loss: 1.2690 - regression_loss: 1.1038 - classification_loss: 0.1652 74/500 [===>..........................] - ETA: 2:25 - loss: 1.2631 - regression_loss: 1.0991 - classification_loss: 0.1640 75/500 [===>..........................] - ETA: 2:24 - loss: 1.2541 - regression_loss: 1.0913 - classification_loss: 0.1627 76/500 [===>..........................] - ETA: 2:24 - loss: 1.2590 - regression_loss: 1.0954 - classification_loss: 0.1636 77/500 [===>..........................] - ETA: 2:24 - loss: 1.2502 - regression_loss: 1.0879 - classification_loss: 0.1623 78/500 [===>..........................] - ETA: 2:23 - loss: 1.2476 - regression_loss: 1.0853 - classification_loss: 0.1623 79/500 [===>..........................] - ETA: 2:23 - loss: 1.2492 - regression_loss: 1.0872 - classification_loss: 0.1620 80/500 [===>..........................] - ETA: 2:23 - loss: 1.2629 - regression_loss: 1.0989 - classification_loss: 0.1639 81/500 [===>..........................] - ETA: 2:22 - loss: 1.2610 - regression_loss: 1.0977 - classification_loss: 0.1633 82/500 [===>..........................] - ETA: 2:22 - loss: 1.2602 - regression_loss: 1.0971 - classification_loss: 0.1631 83/500 [===>..........................] - ETA: 2:22 - loss: 1.2576 - regression_loss: 1.0950 - classification_loss: 0.1625 84/500 [====>.........................] - ETA: 2:21 - loss: 1.2566 - regression_loss: 1.0942 - classification_loss: 0.1623 85/500 [====>.........................] - ETA: 2:21 - loss: 1.2540 - regression_loss: 1.0923 - classification_loss: 0.1616 86/500 [====>.........................] - ETA: 2:21 - loss: 1.2538 - regression_loss: 1.0929 - classification_loss: 0.1609 87/500 [====>.........................] - ETA: 2:21 - loss: 1.2537 - regression_loss: 1.0934 - classification_loss: 0.1603 88/500 [====>.........................] - ETA: 2:20 - loss: 1.2579 - regression_loss: 1.0963 - classification_loss: 0.1616 89/500 [====>.........................] 
- ETA: 2:20 - loss: 1.2612 - regression_loss: 1.0998 - classification_loss: 0.1615 90/500 [====>.........................] - ETA: 2:19 - loss: 1.2565 - regression_loss: 1.0961 - classification_loss: 0.1604 91/500 [====>.........................] - ETA: 2:19 - loss: 1.2597 - regression_loss: 1.0986 - classification_loss: 0.1611 92/500 [====>.........................] - ETA: 2:19 - loss: 1.2634 - regression_loss: 1.1012 - classification_loss: 0.1622 93/500 [====>.........................] - ETA: 2:18 - loss: 1.2670 - regression_loss: 1.1048 - classification_loss: 0.1622 94/500 [====>.........................] - ETA: 2:18 - loss: 1.2676 - regression_loss: 1.1055 - classification_loss: 0.1621 95/500 [====>.........................] - ETA: 2:18 - loss: 1.2773 - regression_loss: 1.1129 - classification_loss: 0.1644 96/500 [====>.........................] - ETA: 2:17 - loss: 1.2814 - regression_loss: 1.1148 - classification_loss: 0.1666 97/500 [====>.........................] - ETA: 2:17 - loss: 1.2797 - regression_loss: 1.1139 - classification_loss: 0.1658 98/500 [====>.........................] - ETA: 2:17 - loss: 1.2821 - regression_loss: 1.1160 - classification_loss: 0.1661 99/500 [====>.........................] - ETA: 2:16 - loss: 1.2779 - regression_loss: 1.1121 - classification_loss: 0.1659 100/500 [=====>........................] - ETA: 2:16 - loss: 1.2758 - regression_loss: 1.1103 - classification_loss: 0.1656 101/500 [=====>........................] - ETA: 2:16 - loss: 1.2778 - regression_loss: 1.1112 - classification_loss: 0.1666 102/500 [=====>........................] - ETA: 2:15 - loss: 1.2797 - regression_loss: 1.1116 - classification_loss: 0.1681 103/500 [=====>........................] - ETA: 2:15 - loss: 1.2834 - regression_loss: 1.1147 - classification_loss: 0.1687 104/500 [=====>........................] - ETA: 2:14 - loss: 1.2832 - regression_loss: 1.1146 - classification_loss: 0.1685 105/500 [=====>........................] 
- ETA: 2:14 - loss: 1.2886 - regression_loss: 1.1191 - classification_loss: 0.1695 106/500 [=====>........................] - ETA: 2:14 - loss: 1.2872 - regression_loss: 1.1181 - classification_loss: 0.1692 107/500 [=====>........................] - ETA: 2:13 - loss: 1.2847 - regression_loss: 1.1160 - classification_loss: 0.1687 108/500 [=====>........................] - ETA: 2:13 - loss: 1.2805 - regression_loss: 1.1123 - classification_loss: 0.1682 109/500 [=====>........................] - ETA: 2:13 - loss: 1.2786 - regression_loss: 1.1115 - classification_loss: 0.1671 110/500 [=====>........................] - ETA: 2:12 - loss: 1.2739 - regression_loss: 1.1077 - classification_loss: 0.1662 111/500 [=====>........................] - ETA: 2:12 - loss: 1.2711 - regression_loss: 1.1056 - classification_loss: 0.1655 112/500 [=====>........................] - ETA: 2:12 - loss: 1.2649 - regression_loss: 1.1005 - classification_loss: 0.1644 113/500 [=====>........................] - ETA: 2:11 - loss: 1.2657 - regression_loss: 1.1019 - classification_loss: 0.1638 114/500 [=====>........................] - ETA: 2:11 - loss: 1.2713 - regression_loss: 1.1056 - classification_loss: 0.1657 115/500 [=====>........................] - ETA: 2:11 - loss: 1.2677 - regression_loss: 1.1027 - classification_loss: 0.1650 116/500 [=====>........................] - ETA: 2:10 - loss: 1.2673 - regression_loss: 1.1023 - classification_loss: 0.1649 117/500 [======>.......................] - ETA: 2:10 - loss: 1.2726 - regression_loss: 1.1070 - classification_loss: 0.1656 118/500 [======>.......................] - ETA: 2:09 - loss: 1.2650 - regression_loss: 1.1006 - classification_loss: 0.1644 119/500 [======>.......................] - ETA: 2:09 - loss: 1.2632 - regression_loss: 1.0985 - classification_loss: 0.1646 120/500 [======>.......................] - ETA: 2:09 - loss: 1.2579 - regression_loss: 1.0939 - classification_loss: 0.1640 121/500 [======>.......................] 
- ETA: 2:08 - loss: 1.2570 - regression_loss: 1.0936 - classification_loss: 0.1634 122/500 [======>.......................] - ETA: 2:08 - loss: 1.2614 - regression_loss: 1.0975 - classification_loss: 0.1639 123/500 [======>.......................] - ETA: 2:08 - loss: 1.2598 - regression_loss: 1.0964 - classification_loss: 0.1633 124/500 [======>.......................] - ETA: 2:07 - loss: 1.2564 - regression_loss: 1.0936 - classification_loss: 0.1628 125/500 [======>.......................] - ETA: 2:07 - loss: 1.2538 - regression_loss: 1.0920 - classification_loss: 0.1618 126/500 [======>.......................] - ETA: 2:07 - loss: 1.2563 - regression_loss: 1.0940 - classification_loss: 0.1624 127/500 [======>.......................] - ETA: 2:06 - loss: 1.2552 - regression_loss: 1.0933 - classification_loss: 0.1619 128/500 [======>.......................] - ETA: 2:06 - loss: 1.2527 - regression_loss: 1.0916 - classification_loss: 0.1611 129/500 [======>.......................] - ETA: 2:06 - loss: 1.2468 - regression_loss: 1.0864 - classification_loss: 0.1604 130/500 [======>.......................] - ETA: 2:05 - loss: 1.2422 - regression_loss: 1.0825 - classification_loss: 0.1597 131/500 [======>.......................] - ETA: 2:05 - loss: 1.2428 - regression_loss: 1.0830 - classification_loss: 0.1598 132/500 [======>.......................] - ETA: 2:04 - loss: 1.2374 - regression_loss: 1.0782 - classification_loss: 0.1592 133/500 [======>.......................] - ETA: 2:04 - loss: 1.2361 - regression_loss: 1.0771 - classification_loss: 0.1590 134/500 [=======>......................] - ETA: 2:04 - loss: 1.2349 - regression_loss: 1.0763 - classification_loss: 0.1586 135/500 [=======>......................] - ETA: 2:03 - loss: 1.2363 - regression_loss: 1.0775 - classification_loss: 0.1588 136/500 [=======>......................] - ETA: 2:03 - loss: 1.2357 - regression_loss: 1.0772 - classification_loss: 0.1585 137/500 [=======>......................] 
- ETA: 2:03 - loss: 1.2438 - regression_loss: 1.0840 - classification_loss: 0.1599 138/500 [=======>......................] - ETA: 2:02 - loss: 1.2451 - regression_loss: 1.0855 - classification_loss: 0.1596 139/500 [=======>......................] - ETA: 2:02 - loss: 1.2418 - regression_loss: 1.0827 - classification_loss: 0.1591 140/500 [=======>......................] - ETA: 2:02 - loss: 1.2375 - regression_loss: 1.0791 - classification_loss: 0.1584 141/500 [=======>......................] - ETA: 2:01 - loss: 1.2361 - regression_loss: 1.0776 - classification_loss: 0.1585 142/500 [=======>......................] - ETA: 2:01 - loss: 1.2370 - regression_loss: 1.0782 - classification_loss: 0.1588 143/500 [=======>......................] - ETA: 2:01 - loss: 1.2368 - regression_loss: 1.0783 - classification_loss: 0.1585 144/500 [=======>......................] - ETA: 2:00 - loss: 1.2422 - regression_loss: 1.0826 - classification_loss: 0.1596 145/500 [=======>......................] - ETA: 2:00 - loss: 1.2473 - regression_loss: 1.0867 - classification_loss: 0.1606 146/500 [=======>......................] - ETA: 2:00 - loss: 1.2471 - regression_loss: 1.0868 - classification_loss: 0.1603 147/500 [=======>......................] - ETA: 1:59 - loss: 1.2470 - regression_loss: 1.0871 - classification_loss: 0.1599 148/500 [=======>......................] - ETA: 1:59 - loss: 1.2449 - regression_loss: 1.0845 - classification_loss: 0.1604 149/500 [=======>......................] - ETA: 1:59 - loss: 1.2438 - regression_loss: 1.0836 - classification_loss: 0.1602 150/500 [========>.....................] - ETA: 1:58 - loss: 1.2430 - regression_loss: 1.0831 - classification_loss: 0.1599 151/500 [========>.....................] - ETA: 1:58 - loss: 1.2413 - regression_loss: 1.0821 - classification_loss: 0.1592 152/500 [========>.....................] - ETA: 1:58 - loss: 1.2416 - regression_loss: 1.0826 - classification_loss: 0.1589 153/500 [========>.....................] 
- ETA: 1:57 - loss: 1.2418 - regression_loss: 1.0829 - classification_loss: 0.1589 154/500 [========>.....................] - ETA: 1:57 - loss: 1.2445 - regression_loss: 1.0849 - classification_loss: 0.1596 155/500 [========>.....................] - ETA: 1:56 - loss: 1.2411 - regression_loss: 1.0822 - classification_loss: 0.1590 156/500 [========>.....................] - ETA: 1:56 - loss: 1.2396 - regression_loss: 1.0808 - classification_loss: 0.1588 157/500 [========>.....................] - ETA: 1:56 - loss: 1.2421 - regression_loss: 1.0826 - classification_loss: 0.1595 158/500 [========>.....................] - ETA: 1:55 - loss: 1.2431 - regression_loss: 1.0836 - classification_loss: 0.1595 159/500 [========>.....................] - ETA: 1:55 - loss: 1.2377 - regression_loss: 1.0784 - classification_loss: 0.1593 160/500 [========>.....................] - ETA: 1:55 - loss: 1.2379 - regression_loss: 1.0788 - classification_loss: 0.1591 161/500 [========>.....................] - ETA: 1:54 - loss: 1.2364 - regression_loss: 1.0776 - classification_loss: 0.1587 162/500 [========>.....................] - ETA: 1:54 - loss: 1.2326 - regression_loss: 1.0742 - classification_loss: 0.1583 163/500 [========>.....................] - ETA: 1:54 - loss: 1.2335 - regression_loss: 1.0750 - classification_loss: 0.1585 164/500 [========>.....................] - ETA: 1:53 - loss: 1.2342 - regression_loss: 1.0758 - classification_loss: 0.1584 165/500 [========>.....................] - ETA: 1:53 - loss: 1.2356 - regression_loss: 1.0769 - classification_loss: 0.1586 166/500 [========>.....................] - ETA: 1:53 - loss: 1.2379 - regression_loss: 1.0788 - classification_loss: 0.1591 167/500 [=========>....................] - ETA: 1:52 - loss: 1.2330 - regression_loss: 1.0747 - classification_loss: 0.1583 168/500 [=========>....................] - ETA: 1:52 - loss: 1.2356 - regression_loss: 1.0771 - classification_loss: 0.1585 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.2362 - regression_loss: 1.0782 - classification_loss: 0.1580 170/500 [=========>....................] - ETA: 1:51 - loss: 1.2354 - regression_loss: 1.0775 - classification_loss: 0.1579 171/500 [=========>....................] - ETA: 1:51 - loss: 1.2344 - regression_loss: 1.0770 - classification_loss: 0.1574 172/500 [=========>....................] - ETA: 1:51 - loss: 1.2370 - regression_loss: 1.0791 - classification_loss: 0.1579 173/500 [=========>....................] - ETA: 1:50 - loss: 1.2397 - regression_loss: 1.0812 - classification_loss: 0.1585 174/500 [=========>....................] - ETA: 1:50 - loss: 1.2402 - regression_loss: 1.0815 - classification_loss: 0.1587 175/500 [=========>....................] - ETA: 1:50 - loss: 1.2393 - regression_loss: 1.0810 - classification_loss: 0.1583 176/500 [=========>....................] - ETA: 1:49 - loss: 1.2364 - regression_loss: 1.0784 - classification_loss: 0.1580 177/500 [=========>....................] - ETA: 1:49 - loss: 1.2365 - regression_loss: 1.0783 - classification_loss: 0.1582 178/500 [=========>....................] - ETA: 1:49 - loss: 1.2423 - regression_loss: 1.0832 - classification_loss: 0.1591 179/500 [=========>....................] - ETA: 1:48 - loss: 1.2408 - regression_loss: 1.0818 - classification_loss: 0.1589 180/500 [=========>....................] - ETA: 1:48 - loss: 1.2390 - regression_loss: 1.0804 - classification_loss: 0.1586 181/500 [=========>....................] - ETA: 1:48 - loss: 1.2367 - regression_loss: 1.0782 - classification_loss: 0.1585 182/500 [=========>....................] - ETA: 1:47 - loss: 1.2376 - regression_loss: 1.0794 - classification_loss: 0.1582 183/500 [=========>....................] - ETA: 1:47 - loss: 1.2425 - regression_loss: 1.0835 - classification_loss: 0.1590 184/500 [==========>...................] - ETA: 1:47 - loss: 1.2408 - regression_loss: 1.0820 - classification_loss: 0.1588 185/500 [==========>...................] 
- ETA: 1:46 - loss: 1.2402 - regression_loss: 1.0816 - classification_loss: 0.1585 186/500 [==========>...................] - ETA: 1:46 - loss: 1.2389 - regression_loss: 1.0808 - classification_loss: 0.1580 187/500 [==========>...................] - ETA: 1:46 - loss: 1.2362 - regression_loss: 1.0786 - classification_loss: 0.1576 188/500 [==========>...................] - ETA: 1:45 - loss: 1.2357 - regression_loss: 1.0781 - classification_loss: 0.1576 189/500 [==========>...................] - ETA: 1:45 - loss: 1.2323 - regression_loss: 1.0749 - classification_loss: 0.1573 190/500 [==========>...................] - ETA: 1:45 - loss: 1.2276 - regression_loss: 1.0709 - classification_loss: 0.1567 191/500 [==========>...................] - ETA: 1:45 - loss: 1.2266 - regression_loss: 1.0703 - classification_loss: 0.1563 192/500 [==========>...................] - ETA: 1:44 - loss: 1.2272 - regression_loss: 1.0708 - classification_loss: 0.1564 193/500 [==========>...................] - ETA: 1:44 - loss: 1.2284 - regression_loss: 1.0718 - classification_loss: 0.1567 194/500 [==========>...................] - ETA: 1:44 - loss: 1.2424 - regression_loss: 1.0822 - classification_loss: 0.1602 195/500 [==========>...................] - ETA: 1:43 - loss: 1.2413 - regression_loss: 1.0814 - classification_loss: 0.1600 196/500 [==========>...................] - ETA: 1:43 - loss: 1.2419 - regression_loss: 1.0811 - classification_loss: 0.1608 197/500 [==========>...................] - ETA: 1:43 - loss: 1.2429 - regression_loss: 1.0816 - classification_loss: 0.1613 198/500 [==========>...................] - ETA: 1:42 - loss: 1.2434 - regression_loss: 1.0821 - classification_loss: 0.1614 199/500 [==========>...................] - ETA: 1:42 - loss: 1.2449 - regression_loss: 1.0835 - classification_loss: 0.1614 200/500 [===========>..................] - ETA: 1:41 - loss: 1.2431 - regression_loss: 1.0821 - classification_loss: 0.1611 201/500 [===========>..................] 
- ETA: 1:41 - loss: 1.2447 - regression_loss: 1.0834 - classification_loss: 0.1613 202/500 [===========>..................] - ETA: 1:41 - loss: 1.2465 - regression_loss: 1.0847 - classification_loss: 0.1618 203/500 [===========>..................] - ETA: 1:40 - loss: 1.2480 - regression_loss: 1.0861 - classification_loss: 0.1620 204/500 [===========>..................] - ETA: 1:40 - loss: 1.2453 - regression_loss: 1.0835 - classification_loss: 0.1619 205/500 [===========>..................] - ETA: 1:40 - loss: 1.2482 - regression_loss: 1.0861 - classification_loss: 0.1621 206/500 [===========>..................] - ETA: 1:39 - loss: 1.2467 - regression_loss: 1.0850 - classification_loss: 0.1617 207/500 [===========>..................] - ETA: 1:39 - loss: 1.2461 - regression_loss: 1.0845 - classification_loss: 0.1616 208/500 [===========>..................] - ETA: 1:39 - loss: 1.2406 - regression_loss: 1.0793 - classification_loss: 0.1613 209/500 [===========>..................] - ETA: 1:38 - loss: 1.2407 - regression_loss: 1.0795 - classification_loss: 0.1612 210/500 [===========>..................] - ETA: 1:38 - loss: 1.2425 - regression_loss: 1.0814 - classification_loss: 0.1611 211/500 [===========>..................] - ETA: 1:38 - loss: 1.2487 - regression_loss: 1.0862 - classification_loss: 0.1625 212/500 [===========>..................] - ETA: 1:37 - loss: 1.2454 - regression_loss: 1.0830 - classification_loss: 0.1623 213/500 [===========>..................] - ETA: 1:37 - loss: 1.2447 - regression_loss: 1.0826 - classification_loss: 0.1622 214/500 [===========>..................] - ETA: 1:37 - loss: 1.2482 - regression_loss: 1.0852 - classification_loss: 0.1631 215/500 [===========>..................] - ETA: 1:36 - loss: 1.2476 - regression_loss: 1.0848 - classification_loss: 0.1628 216/500 [===========>..................] - ETA: 1:36 - loss: 1.2495 - regression_loss: 1.0862 - classification_loss: 0.1633 217/500 [============>.................] 
- ETA: 1:36 - loss: 1.2515 - regression_loss: 1.0877 - classification_loss: 0.1638 218/500 [============>.................] - ETA: 1:35 - loss: 1.2525 - regression_loss: 1.0884 - classification_loss: 0.1640 219/500 [============>.................] - ETA: 1:35 - loss: 1.2510 - regression_loss: 1.0872 - classification_loss: 0.1637 220/500 [============>.................] - ETA: 1:35 - loss: 1.2498 - regression_loss: 1.0863 - classification_loss: 0.1635 221/500 [============>.................] - ETA: 1:34 - loss: 1.2504 - regression_loss: 1.0869 - classification_loss: 0.1634 222/500 [============>.................] - ETA: 1:34 - loss: 1.2483 - regression_loss: 1.0853 - classification_loss: 0.1631 223/500 [============>.................] - ETA: 1:34 - loss: 1.2450 - regression_loss: 1.0825 - classification_loss: 0.1625 224/500 [============>.................] - ETA: 1:33 - loss: 1.2453 - regression_loss: 1.0823 - classification_loss: 0.1631 225/500 [============>.................] - ETA: 1:33 - loss: 1.2454 - regression_loss: 1.0824 - classification_loss: 0.1630 226/500 [============>.................] - ETA: 1:33 - loss: 1.2448 - regression_loss: 1.0822 - classification_loss: 0.1627 227/500 [============>.................] - ETA: 1:32 - loss: 1.2468 - regression_loss: 1.0832 - classification_loss: 0.1636 228/500 [============>.................] - ETA: 1:32 - loss: 1.2464 - regression_loss: 1.0829 - classification_loss: 0.1635 229/500 [============>.................] - ETA: 1:32 - loss: 1.2469 - regression_loss: 1.0834 - classification_loss: 0.1635 230/500 [============>.................] - ETA: 1:31 - loss: 1.2485 - regression_loss: 1.0845 - classification_loss: 0.1640 231/500 [============>.................] - ETA: 1:31 - loss: 1.2531 - regression_loss: 1.0880 - classification_loss: 0.1650 232/500 [============>.................] - ETA: 1:30 - loss: 1.2540 - regression_loss: 1.0887 - classification_loss: 0.1653 233/500 [============>.................] 
- ETA: 1:30 - loss: 1.2548 - regression_loss: 1.0893 - classification_loss: 0.1655 234/500 [=============>................] - ETA: 1:30 - loss: 1.2539 - regression_loss: 1.0887 - classification_loss: 0.1652 235/500 [=============>................] - ETA: 1:29 - loss: 1.2564 - regression_loss: 1.0906 - classification_loss: 0.1658 236/500 [=============>................] - ETA: 1:29 - loss: 1.2559 - regression_loss: 1.0903 - classification_loss: 0.1656 237/500 [=============>................] - ETA: 1:29 - loss: 1.2580 - regression_loss: 1.0923 - classification_loss: 0.1657 238/500 [=============>................] - ETA: 1:28 - loss: 1.2590 - regression_loss: 1.0933 - classification_loss: 0.1657 239/500 [=============>................] - ETA: 1:28 - loss: 1.2563 - regression_loss: 1.0910 - classification_loss: 0.1653 240/500 [=============>................] - ETA: 1:28 - loss: 1.2587 - regression_loss: 1.0932 - classification_loss: 0.1655 241/500 [=============>................] - ETA: 1:27 - loss: 1.2582 - regression_loss: 1.0929 - classification_loss: 0.1653 242/500 [=============>................] - ETA: 1:27 - loss: 1.2580 - regression_loss: 1.0929 - classification_loss: 0.1651 243/500 [=============>................] - ETA: 1:27 - loss: 1.2625 - regression_loss: 1.0956 - classification_loss: 0.1669 244/500 [=============>................] - ETA: 1:26 - loss: 1.2636 - regression_loss: 1.0967 - classification_loss: 0.1670 245/500 [=============>................] - ETA: 1:26 - loss: 1.2654 - regression_loss: 1.0982 - classification_loss: 0.1672 246/500 [=============>................] - ETA: 1:26 - loss: 1.2658 - regression_loss: 1.0988 - classification_loss: 0.1671 247/500 [=============>................] - ETA: 1:25 - loss: 1.2643 - regression_loss: 1.0976 - classification_loss: 0.1667 248/500 [=============>................] - ETA: 1:25 - loss: 1.2633 - regression_loss: 1.0969 - classification_loss: 0.1664 249/500 [=============>................] 
- ETA: 1:25 - loss: 1.2621 - regression_loss: 1.0960 - classification_loss: 0.1661 250/500 [==============>...............] - ETA: 1:24 - loss: 1.2624 - regression_loss: 1.0961 - classification_loss: 0.1663 251/500 [==============>...............] - ETA: 1:24 - loss: 1.2624 - regression_loss: 1.0962 - classification_loss: 0.1662 252/500 [==============>...............] - ETA: 1:24 - loss: 1.2601 - regression_loss: 1.0944 - classification_loss: 0.1658 253/500 [==============>...............] - ETA: 1:23 - loss: 1.2603 - regression_loss: 1.0946 - classification_loss: 0.1657 254/500 [==============>...............] - ETA: 1:23 - loss: 1.2609 - regression_loss: 1.0950 - classification_loss: 0.1659 255/500 [==============>...............] - ETA: 1:23 - loss: 1.2611 - regression_loss: 1.0954 - classification_loss: 0.1657 256/500 [==============>...............] - ETA: 1:22 - loss: 1.2602 - regression_loss: 1.0948 - classification_loss: 0.1654 257/500 [==============>...............] - ETA: 1:22 - loss: 1.2615 - regression_loss: 1.0954 - classification_loss: 0.1661 258/500 [==============>...............] - ETA: 1:22 - loss: 1.2611 - regression_loss: 1.0951 - classification_loss: 0.1660 259/500 [==============>...............] - ETA: 1:21 - loss: 1.2602 - regression_loss: 1.0944 - classification_loss: 0.1658 260/500 [==============>...............] - ETA: 1:21 - loss: 1.2588 - regression_loss: 1.0933 - classification_loss: 0.1655 261/500 [==============>...............] - ETA: 1:21 - loss: 1.2601 - regression_loss: 1.0943 - classification_loss: 0.1657 262/500 [==============>...............] - ETA: 1:20 - loss: 1.2609 - regression_loss: 1.0948 - classification_loss: 0.1661 263/500 [==============>...............] - ETA: 1:20 - loss: 1.2598 - regression_loss: 1.0939 - classification_loss: 0.1659 264/500 [==============>...............] - ETA: 1:20 - loss: 1.2562 - regression_loss: 1.0908 - classification_loss: 0.1654 265/500 [==============>...............] 
- ETA: 1:19 - loss: 1.2556 - regression_loss: 1.0904 - classification_loss: 0.1652
[intermediate progress-bar updates for steps 266-499 of epoch 15 collapsed]
500/500 [==============================] - 170s 339ms/step - loss: 1.2414 - regression_loss: 1.0735 - classification_loss: 0.1679
326 instances of class plum with average precision: 0.8327
mAP: 0.8327
Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5
Epoch 16/150
1/500 [..............................] - ETA: 2:39 - loss: 0.9439 - regression_loss: 0.8879 - classification_loss: 0.0560
2/500 [..............................] - ETA: 2:43 - loss: 1.0768 - regression_loss: 0.9685 - classification_loss: 0.1083
3/500 [..............................] - ETA: 2:46 - loss: 0.8261 - regression_loss: 0.7428 - classification_loss: 0.0833
4/500 [..............................]
- ETA: 2:46 - loss: 0.7281 - regression_loss: 0.6588 - classification_loss: 0.0693
[intermediate progress-bar updates for steps 4-98 of epoch 16 collapsed]
99/500 [====>.........................] - ETA: 2:15 - loss: 1.1885 - regression_loss: 1.0383 - classification_loss: 0.1502
100/500 [=====>........................]
- ETA: 2:14 - loss: 1.1923 - regression_loss: 1.0412 - classification_loss: 0.1511 101/500 [=====>........................] - ETA: 2:14 - loss: 1.2035 - regression_loss: 1.0504 - classification_loss: 0.1531 102/500 [=====>........................] - ETA: 2:14 - loss: 1.2080 - regression_loss: 1.0543 - classification_loss: 0.1537 103/500 [=====>........................] - ETA: 2:13 - loss: 1.2092 - regression_loss: 1.0559 - classification_loss: 0.1533 104/500 [=====>........................] - ETA: 2:13 - loss: 1.2083 - regression_loss: 1.0557 - classification_loss: 0.1526 105/500 [=====>........................] - ETA: 2:13 - loss: 1.2106 - regression_loss: 1.0576 - classification_loss: 0.1530 106/500 [=====>........................] - ETA: 2:12 - loss: 1.2105 - regression_loss: 1.0582 - classification_loss: 0.1523 107/500 [=====>........................] - ETA: 2:12 - loss: 1.2184 - regression_loss: 1.0634 - classification_loss: 0.1550 108/500 [=====>........................] - ETA: 2:12 - loss: 1.2170 - regression_loss: 1.0628 - classification_loss: 0.1542 109/500 [=====>........................] - ETA: 2:11 - loss: 1.2212 - regression_loss: 1.0662 - classification_loss: 0.1551 110/500 [=====>........................] - ETA: 2:11 - loss: 1.2228 - regression_loss: 1.0680 - classification_loss: 0.1548 111/500 [=====>........................] - ETA: 2:11 - loss: 1.2195 - regression_loss: 1.0652 - classification_loss: 0.1543 112/500 [=====>........................] - ETA: 2:10 - loss: 1.2158 - regression_loss: 1.0618 - classification_loss: 0.1540 113/500 [=====>........................] - ETA: 2:10 - loss: 1.2179 - regression_loss: 1.0633 - classification_loss: 0.1546 114/500 [=====>........................] - ETA: 2:10 - loss: 1.2128 - regression_loss: 1.0590 - classification_loss: 0.1538 115/500 [=====>........................] - ETA: 2:09 - loss: 1.2153 - regression_loss: 1.0609 - classification_loss: 0.1543 116/500 [=====>........................] 
- ETA: 2:09 - loss: 1.2107 - regression_loss: 1.0570 - classification_loss: 0.1537 117/500 [======>.......................] - ETA: 2:09 - loss: 1.2134 - regression_loss: 1.0594 - classification_loss: 0.1540 118/500 [======>.......................] - ETA: 2:09 - loss: 1.2121 - regression_loss: 1.0586 - classification_loss: 0.1534 119/500 [======>.......................] - ETA: 2:08 - loss: 1.2117 - regression_loss: 1.0588 - classification_loss: 0.1529 120/500 [======>.......................] - ETA: 2:08 - loss: 1.2180 - regression_loss: 1.0642 - classification_loss: 0.1539 121/500 [======>.......................] - ETA: 2:08 - loss: 1.2205 - regression_loss: 1.0659 - classification_loss: 0.1546 122/500 [======>.......................] - ETA: 2:07 - loss: 1.2193 - regression_loss: 1.0645 - classification_loss: 0.1548 123/500 [======>.......................] - ETA: 2:07 - loss: 1.2208 - regression_loss: 1.0662 - classification_loss: 0.1547 124/500 [======>.......................] - ETA: 2:07 - loss: 1.2226 - regression_loss: 1.0681 - classification_loss: 0.1545 125/500 [======>.......................] - ETA: 2:06 - loss: 1.2198 - regression_loss: 1.0660 - classification_loss: 0.1538 126/500 [======>.......................] - ETA: 2:06 - loss: 1.2141 - regression_loss: 1.0612 - classification_loss: 0.1529 127/500 [======>.......................] - ETA: 2:06 - loss: 1.2113 - regression_loss: 1.0585 - classification_loss: 0.1528 128/500 [======>.......................] - ETA: 2:05 - loss: 1.2093 - regression_loss: 1.0573 - classification_loss: 0.1521 129/500 [======>.......................] - ETA: 2:05 - loss: 1.2095 - regression_loss: 1.0573 - classification_loss: 0.1521 130/500 [======>.......................] - ETA: 2:05 - loss: 1.2080 - regression_loss: 1.0556 - classification_loss: 0.1525 131/500 [======>.......................] - ETA: 2:04 - loss: 1.2163 - regression_loss: 1.0627 - classification_loss: 0.1535 132/500 [======>.......................] 
- ETA: 2:04 - loss: 1.2118 - regression_loss: 1.0590 - classification_loss: 0.1528 133/500 [======>.......................] - ETA: 2:04 - loss: 1.2111 - regression_loss: 1.0582 - classification_loss: 0.1529 134/500 [=======>......................] - ETA: 2:03 - loss: 1.2066 - regression_loss: 1.0544 - classification_loss: 0.1522 135/500 [=======>......................] - ETA: 2:03 - loss: 1.2085 - regression_loss: 1.0564 - classification_loss: 0.1521 136/500 [=======>......................] - ETA: 2:03 - loss: 1.2068 - regression_loss: 1.0551 - classification_loss: 0.1518 137/500 [=======>......................] - ETA: 2:02 - loss: 1.2018 - regression_loss: 1.0507 - classification_loss: 0.1511 138/500 [=======>......................] - ETA: 2:02 - loss: 1.2032 - regression_loss: 1.0519 - classification_loss: 0.1514 139/500 [=======>......................] - ETA: 2:02 - loss: 1.2029 - regression_loss: 1.0518 - classification_loss: 0.1511 140/500 [=======>......................] - ETA: 2:01 - loss: 1.2007 - regression_loss: 1.0497 - classification_loss: 0.1509 141/500 [=======>......................] - ETA: 2:01 - loss: 1.1981 - regression_loss: 1.0473 - classification_loss: 0.1508 142/500 [=======>......................] - ETA: 2:01 - loss: 1.1963 - regression_loss: 1.0456 - classification_loss: 0.1506 143/500 [=======>......................] - ETA: 2:00 - loss: 1.1956 - regression_loss: 1.0449 - classification_loss: 0.1506 144/500 [=======>......................] - ETA: 2:00 - loss: 1.1982 - regression_loss: 1.0471 - classification_loss: 0.1511 145/500 [=======>......................] - ETA: 2:00 - loss: 1.1969 - regression_loss: 1.0462 - classification_loss: 0.1508 146/500 [=======>......................] - ETA: 2:00 - loss: 1.1947 - regression_loss: 1.0442 - classification_loss: 0.1505 147/500 [=======>......................] - ETA: 1:59 - loss: 1.1920 - regression_loss: 1.0421 - classification_loss: 0.1498 148/500 [=======>......................] 
- ETA: 1:59 - loss: 1.1925 - regression_loss: 1.0425 - classification_loss: 0.1500 149/500 [=======>......................] - ETA: 1:59 - loss: 1.1970 - regression_loss: 1.0461 - classification_loss: 0.1509 150/500 [========>.....................] - ETA: 1:58 - loss: 1.1971 - regression_loss: 1.0463 - classification_loss: 0.1508 151/500 [========>.....................] - ETA: 1:58 - loss: 1.1905 - regression_loss: 1.0404 - classification_loss: 0.1501 152/500 [========>.....................] - ETA: 1:58 - loss: 1.1899 - regression_loss: 1.0400 - classification_loss: 0.1498 153/500 [========>.....................] - ETA: 1:57 - loss: 1.1944 - regression_loss: 1.0435 - classification_loss: 0.1509 154/500 [========>.....................] - ETA: 1:57 - loss: 1.1878 - regression_loss: 1.0378 - classification_loss: 0.1501 155/500 [========>.....................] - ETA: 1:57 - loss: 1.1902 - regression_loss: 1.0404 - classification_loss: 0.1498 156/500 [========>.....................] - ETA: 1:56 - loss: 1.1919 - regression_loss: 1.0410 - classification_loss: 0.1509 157/500 [========>.....................] - ETA: 1:56 - loss: 1.1958 - regression_loss: 1.0439 - classification_loss: 0.1519 158/500 [========>.....................] - ETA: 1:55 - loss: 1.1945 - regression_loss: 1.0430 - classification_loss: 0.1515 159/500 [========>.....................] - ETA: 1:55 - loss: 1.1957 - regression_loss: 1.0440 - classification_loss: 0.1518 160/500 [========>.....................] - ETA: 1:55 - loss: 1.1952 - regression_loss: 1.0433 - classification_loss: 0.1519 161/500 [========>.....................] - ETA: 1:54 - loss: 1.1958 - regression_loss: 1.0437 - classification_loss: 0.1521 162/500 [========>.....................] - ETA: 1:54 - loss: 1.1987 - regression_loss: 1.0456 - classification_loss: 0.1531 163/500 [========>.....................] - ETA: 1:54 - loss: 1.1944 - regression_loss: 1.0414 - classification_loss: 0.1529 164/500 [========>.....................] 
- ETA: 1:53 - loss: 1.1894 - regression_loss: 1.0369 - classification_loss: 0.1525 165/500 [========>.....................] - ETA: 1:53 - loss: 1.1905 - regression_loss: 1.0381 - classification_loss: 0.1523 166/500 [========>.....................] - ETA: 1:53 - loss: 1.1941 - regression_loss: 1.0409 - classification_loss: 0.1532 167/500 [=========>....................] - ETA: 1:52 - loss: 1.1940 - regression_loss: 1.0407 - classification_loss: 0.1532 168/500 [=========>....................] - ETA: 1:52 - loss: 1.1934 - regression_loss: 1.0407 - classification_loss: 0.1527 169/500 [=========>....................] - ETA: 1:52 - loss: 1.1973 - regression_loss: 1.0438 - classification_loss: 0.1536 170/500 [=========>....................] - ETA: 1:52 - loss: 1.1979 - regression_loss: 1.0448 - classification_loss: 0.1530 171/500 [=========>....................] - ETA: 1:51 - loss: 1.1966 - regression_loss: 1.0437 - classification_loss: 0.1529 172/500 [=========>....................] - ETA: 1:51 - loss: 1.1927 - regression_loss: 1.0404 - classification_loss: 0.1523 173/500 [=========>....................] - ETA: 1:50 - loss: 1.1885 - regression_loss: 1.0368 - classification_loss: 0.1516 174/500 [=========>....................] - ETA: 1:50 - loss: 1.1871 - regression_loss: 1.0360 - classification_loss: 0.1511 175/500 [=========>....................] - ETA: 1:50 - loss: 1.1890 - regression_loss: 1.0377 - classification_loss: 0.1512 176/500 [=========>....................] - ETA: 1:50 - loss: 1.1884 - regression_loss: 1.0374 - classification_loss: 0.1510 177/500 [=========>....................] - ETA: 1:49 - loss: 1.1872 - regression_loss: 1.0367 - classification_loss: 0.1505 178/500 [=========>....................] - ETA: 1:49 - loss: 1.1898 - regression_loss: 1.0391 - classification_loss: 0.1507 179/500 [=========>....................] - ETA: 1:48 - loss: 1.1894 - regression_loss: 1.0388 - classification_loss: 0.1506 180/500 [=========>....................] 
- ETA: 1:48 - loss: 1.1872 - regression_loss: 1.0370 - classification_loss: 0.1502 181/500 [=========>....................] - ETA: 1:48 - loss: 1.1924 - regression_loss: 1.0408 - classification_loss: 0.1516 182/500 [=========>....................] - ETA: 1:47 - loss: 1.1921 - regression_loss: 1.0404 - classification_loss: 0.1517 183/500 [=========>....................] - ETA: 1:47 - loss: 1.1916 - regression_loss: 1.0401 - classification_loss: 0.1515 184/500 [==========>...................] - ETA: 1:47 - loss: 1.1936 - regression_loss: 1.0421 - classification_loss: 0.1515 185/500 [==========>...................] - ETA: 1:47 - loss: 1.1916 - regression_loss: 1.0405 - classification_loss: 0.1512 186/500 [==========>...................] - ETA: 1:46 - loss: 1.1906 - regression_loss: 1.0390 - classification_loss: 0.1516 187/500 [==========>...................] - ETA: 1:46 - loss: 1.1885 - regression_loss: 1.0367 - classification_loss: 0.1518 188/500 [==========>...................] - ETA: 1:45 - loss: 1.1899 - regression_loss: 1.0379 - classification_loss: 0.1521 189/500 [==========>...................] - ETA: 1:45 - loss: 1.1957 - regression_loss: 1.0419 - classification_loss: 0.1539 190/500 [==========>...................] - ETA: 1:45 - loss: 1.1923 - regression_loss: 1.0389 - classification_loss: 0.1533 191/500 [==========>...................] - ETA: 1:44 - loss: 1.1907 - regression_loss: 1.0377 - classification_loss: 0.1529 192/500 [==========>...................] - ETA: 1:44 - loss: 1.1894 - regression_loss: 1.0370 - classification_loss: 0.1524 193/500 [==========>...................] - ETA: 1:44 - loss: 1.1903 - regression_loss: 1.0383 - classification_loss: 0.1521 194/500 [==========>...................] - ETA: 1:43 - loss: 1.1881 - regression_loss: 1.0364 - classification_loss: 0.1517 195/500 [==========>...................] - ETA: 1:43 - loss: 1.1893 - regression_loss: 1.0371 - classification_loss: 0.1522 196/500 [==========>...................] 
- ETA: 1:43 - loss: 1.1895 - regression_loss: 1.0374 - classification_loss: 0.1522 197/500 [==========>...................] - ETA: 1:42 - loss: 1.1888 - regression_loss: 1.0369 - classification_loss: 0.1519 198/500 [==========>...................] - ETA: 1:42 - loss: 1.1898 - regression_loss: 1.0378 - classification_loss: 0.1519 199/500 [==========>...................] - ETA: 1:42 - loss: 1.1913 - regression_loss: 1.0396 - classification_loss: 0.1518 200/500 [===========>..................] - ETA: 1:41 - loss: 1.1900 - regression_loss: 1.0385 - classification_loss: 0.1514 201/500 [===========>..................] - ETA: 1:41 - loss: 1.1897 - regression_loss: 1.0384 - classification_loss: 0.1512 202/500 [===========>..................] - ETA: 1:41 - loss: 1.1915 - regression_loss: 1.0402 - classification_loss: 0.1513 203/500 [===========>..................] - ETA: 1:40 - loss: 1.1914 - regression_loss: 1.0405 - classification_loss: 0.1509 204/500 [===========>..................] - ETA: 1:40 - loss: 1.1898 - regression_loss: 1.0392 - classification_loss: 0.1505 205/500 [===========>..................] - ETA: 1:40 - loss: 1.1918 - regression_loss: 1.0410 - classification_loss: 0.1508 206/500 [===========>..................] - ETA: 1:39 - loss: 1.1939 - regression_loss: 1.0428 - classification_loss: 0.1511 207/500 [===========>..................] - ETA: 1:39 - loss: 1.1958 - regression_loss: 1.0448 - classification_loss: 0.1510 208/500 [===========>..................] - ETA: 1:39 - loss: 1.1974 - regression_loss: 1.0463 - classification_loss: 0.1512 209/500 [===========>..................] - ETA: 1:38 - loss: 1.1975 - regression_loss: 1.0463 - classification_loss: 0.1512 210/500 [===========>..................] - ETA: 1:38 - loss: 1.1971 - regression_loss: 1.0460 - classification_loss: 0.1511 211/500 [===========>..................] - ETA: 1:38 - loss: 1.1940 - regression_loss: 1.0433 - classification_loss: 0.1506 212/500 [===========>..................] 
- ETA: 1:37 - loss: 1.1970 - regression_loss: 1.0456 - classification_loss: 0.1515 213/500 [===========>..................] - ETA: 1:37 - loss: 1.1992 - regression_loss: 1.0470 - classification_loss: 0.1522 214/500 [===========>..................] - ETA: 1:37 - loss: 1.2002 - regression_loss: 1.0477 - classification_loss: 0.1525 215/500 [===========>..................] - ETA: 1:36 - loss: 1.1987 - regression_loss: 1.0465 - classification_loss: 0.1522 216/500 [===========>..................] - ETA: 1:36 - loss: 1.2014 - regression_loss: 1.0488 - classification_loss: 0.1526 217/500 [============>.................] - ETA: 1:36 - loss: 1.2013 - regression_loss: 1.0489 - classification_loss: 0.1524 218/500 [============>.................] - ETA: 1:35 - loss: 1.2008 - regression_loss: 1.0483 - classification_loss: 0.1524 219/500 [============>.................] - ETA: 1:35 - loss: 1.2020 - regression_loss: 1.0492 - classification_loss: 0.1528 220/500 [============>.................] - ETA: 1:35 - loss: 1.2021 - regression_loss: 1.0493 - classification_loss: 0.1529 221/500 [============>.................] - ETA: 1:34 - loss: 1.2037 - regression_loss: 1.0507 - classification_loss: 0.1530 222/500 [============>.................] - ETA: 1:34 - loss: 1.2012 - regression_loss: 1.0487 - classification_loss: 0.1525 223/500 [============>.................] - ETA: 1:34 - loss: 1.2011 - regression_loss: 1.0489 - classification_loss: 0.1522 224/500 [============>.................] - ETA: 1:33 - loss: 1.2001 - regression_loss: 1.0482 - classification_loss: 0.1519 225/500 [============>.................] - ETA: 1:33 - loss: 1.2018 - regression_loss: 1.0497 - classification_loss: 0.1521 226/500 [============>.................] - ETA: 1:33 - loss: 1.2003 - regression_loss: 1.0482 - classification_loss: 0.1520 227/500 [============>.................] - ETA: 1:32 - loss: 1.2021 - regression_loss: 1.0498 - classification_loss: 0.1523 228/500 [============>.................] 
- ETA: 1:32 - loss: 1.2031 - regression_loss: 1.0509 - classification_loss: 0.1523 229/500 [============>.................] - ETA: 1:32 - loss: 1.2053 - regression_loss: 1.0526 - classification_loss: 0.1527 230/500 [============>.................] - ETA: 1:31 - loss: 1.2053 - regression_loss: 1.0522 - classification_loss: 0.1531 231/500 [============>.................] - ETA: 1:31 - loss: 1.2087 - regression_loss: 1.0549 - classification_loss: 0.1538 232/500 [============>.................] - ETA: 1:30 - loss: 1.2092 - regression_loss: 1.0551 - classification_loss: 0.1541 233/500 [============>.................] - ETA: 1:30 - loss: 1.2112 - regression_loss: 1.0568 - classification_loss: 0.1544 234/500 [=============>................] - ETA: 1:30 - loss: 1.2151 - regression_loss: 1.0605 - classification_loss: 0.1547 235/500 [=============>................] - ETA: 1:29 - loss: 1.2168 - regression_loss: 1.0618 - classification_loss: 0.1550 236/500 [=============>................] - ETA: 1:29 - loss: 1.2174 - regression_loss: 1.0624 - classification_loss: 0.1550 237/500 [=============>................] - ETA: 1:29 - loss: 1.2178 - regression_loss: 1.0627 - classification_loss: 0.1551 238/500 [=============>................] - ETA: 1:28 - loss: 1.2201 - regression_loss: 1.0646 - classification_loss: 0.1555 239/500 [=============>................] - ETA: 1:28 - loss: 1.2183 - regression_loss: 1.0631 - classification_loss: 0.1552 240/500 [=============>................] - ETA: 1:28 - loss: 1.2194 - regression_loss: 1.0637 - classification_loss: 0.1557 241/500 [=============>................] - ETA: 1:27 - loss: 1.2214 - regression_loss: 1.0655 - classification_loss: 0.1560 242/500 [=============>................] - ETA: 1:27 - loss: 1.2228 - regression_loss: 1.0664 - classification_loss: 0.1564 243/500 [=============>................] - ETA: 1:27 - loss: 1.2196 - regression_loss: 1.0636 - classification_loss: 0.1561 244/500 [=============>................] 
- ETA: 1:26 - loss: 1.2172 - regression_loss: 1.0614 - classification_loss: 0.1557 245/500 [=============>................] - ETA: 1:26 - loss: 1.2172 - regression_loss: 1.0613 - classification_loss: 0.1559 246/500 [=============>................] - ETA: 1:26 - loss: 1.2173 - regression_loss: 1.0615 - classification_loss: 0.1557 247/500 [=============>................] - ETA: 1:25 - loss: 1.2215 - regression_loss: 1.0651 - classification_loss: 0.1564 248/500 [=============>................] - ETA: 1:25 - loss: 1.2231 - regression_loss: 1.0666 - classification_loss: 0.1565 249/500 [=============>................] - ETA: 1:25 - loss: 1.2228 - regression_loss: 1.0664 - classification_loss: 0.1564 250/500 [==============>...............] - ETA: 1:24 - loss: 1.2225 - regression_loss: 1.0662 - classification_loss: 0.1562 251/500 [==============>...............] - ETA: 1:24 - loss: 1.2223 - regression_loss: 1.0662 - classification_loss: 0.1561 252/500 [==============>...............] - ETA: 1:24 - loss: 1.2224 - regression_loss: 1.0665 - classification_loss: 0.1559 253/500 [==============>...............] - ETA: 1:23 - loss: 1.2199 - regression_loss: 1.0644 - classification_loss: 0.1555 254/500 [==============>...............] - ETA: 1:23 - loss: 1.2203 - regression_loss: 1.0647 - classification_loss: 0.1555 255/500 [==============>...............] - ETA: 1:23 - loss: 1.2175 - regression_loss: 1.0624 - classification_loss: 0.1551 256/500 [==============>...............] - ETA: 1:22 - loss: 1.2180 - regression_loss: 1.0629 - classification_loss: 0.1551 257/500 [==============>...............] - ETA: 1:22 - loss: 1.2180 - regression_loss: 1.0630 - classification_loss: 0.1550 258/500 [==============>...............] - ETA: 1:22 - loss: 1.2176 - regression_loss: 1.0626 - classification_loss: 0.1550 259/500 [==============>...............] - ETA: 1:21 - loss: 1.2186 - regression_loss: 1.0637 - classification_loss: 0.1549 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.2192 - regression_loss: 1.0644 - classification_loss: 0.1548 261/500 [==============>...............] - ETA: 1:21 - loss: 1.2216 - regression_loss: 1.0659 - classification_loss: 0.1556 262/500 [==============>...............] - ETA: 1:20 - loss: 1.2228 - regression_loss: 1.0672 - classification_loss: 0.1556 263/500 [==============>...............] - ETA: 1:20 - loss: 1.2236 - regression_loss: 1.0680 - classification_loss: 0.1556 264/500 [==============>...............] - ETA: 1:20 - loss: 1.2240 - regression_loss: 1.0686 - classification_loss: 0.1554 265/500 [==============>...............] - ETA: 1:19 - loss: 1.2226 - regression_loss: 1.0675 - classification_loss: 0.1552 266/500 [==============>...............] - ETA: 1:19 - loss: 1.2199 - regression_loss: 1.0650 - classification_loss: 0.1550 267/500 [===============>..............] - ETA: 1:19 - loss: 1.2202 - regression_loss: 1.0652 - classification_loss: 0.1550 268/500 [===============>..............] - ETA: 1:18 - loss: 1.2210 - regression_loss: 1.0656 - classification_loss: 0.1554 269/500 [===============>..............] - ETA: 1:18 - loss: 1.2190 - regression_loss: 1.0639 - classification_loss: 0.1551 270/500 [===============>..............] - ETA: 1:17 - loss: 1.2204 - regression_loss: 1.0650 - classification_loss: 0.1554 271/500 [===============>..............] - ETA: 1:17 - loss: 1.2270 - regression_loss: 1.0698 - classification_loss: 0.1573 272/500 [===============>..............] - ETA: 1:17 - loss: 1.2259 - regression_loss: 1.0688 - classification_loss: 0.1571 273/500 [===============>..............] - ETA: 1:16 - loss: 1.2251 - regression_loss: 1.0683 - classification_loss: 0.1568 274/500 [===============>..............] - ETA: 1:16 - loss: 1.2249 - regression_loss: 1.0682 - classification_loss: 0.1568 275/500 [===============>..............] - ETA: 1:16 - loss: 1.2286 - regression_loss: 1.0713 - classification_loss: 0.1573 276/500 [===============>..............] 
- ETA: 1:15 - loss: 1.2292 - regression_loss: 1.0721 - classification_loss: 0.1571 277/500 [===============>..............] - ETA: 1:15 - loss: 1.2275 - regression_loss: 1.0708 - classification_loss: 0.1567 278/500 [===============>..............] - ETA: 1:15 - loss: 1.2269 - regression_loss: 1.0704 - classification_loss: 0.1565 279/500 [===============>..............] - ETA: 1:14 - loss: 1.2261 - regression_loss: 1.0697 - classification_loss: 0.1563 280/500 [===============>..............] - ETA: 1:14 - loss: 1.2267 - regression_loss: 1.0701 - classification_loss: 0.1566 281/500 [===============>..............] - ETA: 1:14 - loss: 1.2278 - regression_loss: 1.0698 - classification_loss: 0.1580 282/500 [===============>..............] - ETA: 1:13 - loss: 1.2367 - regression_loss: 1.0770 - classification_loss: 0.1597 283/500 [===============>..............] - ETA: 1:13 - loss: 1.2378 - regression_loss: 1.0783 - classification_loss: 0.1596 284/500 [================>.............] - ETA: 1:13 - loss: 1.2379 - regression_loss: 1.0784 - classification_loss: 0.1596 285/500 [================>.............] - ETA: 1:12 - loss: 1.2350 - regression_loss: 1.0759 - classification_loss: 0.1591 286/500 [================>.............] - ETA: 1:12 - loss: 1.2343 - regression_loss: 1.0754 - classification_loss: 0.1589 287/500 [================>.............] - ETA: 1:12 - loss: 1.2327 - regression_loss: 1.0742 - classification_loss: 0.1585 288/500 [================>.............] - ETA: 1:11 - loss: 1.2337 - regression_loss: 1.0749 - classification_loss: 0.1588 289/500 [================>.............] - ETA: 1:11 - loss: 1.2322 - regression_loss: 1.0736 - classification_loss: 0.1586 290/500 [================>.............] - ETA: 1:11 - loss: 1.2310 - regression_loss: 1.0725 - classification_loss: 0.1585 291/500 [================>.............] - ETA: 1:10 - loss: 1.2326 - regression_loss: 1.0735 - classification_loss: 0.1591 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.2324 - regression_loss: 1.0734 - classification_loss: 0.1590 293/500 [================>.............] - ETA: 1:10 - loss: 1.2383 - regression_loss: 1.0780 - classification_loss: 0.1603 294/500 [================>.............] - ETA: 1:09 - loss: 1.2407 - regression_loss: 1.0798 - classification_loss: 0.1608 295/500 [================>.............] - ETA: 1:09 - loss: 1.2421 - regression_loss: 1.0815 - classification_loss: 0.1605 296/500 [================>.............] - ETA: 1:09 - loss: 1.2407 - regression_loss: 1.0805 - classification_loss: 0.1603 297/500 [================>.............] - ETA: 1:08 - loss: 1.2405 - regression_loss: 1.0802 - classification_loss: 0.1604 298/500 [================>.............] - ETA: 1:08 - loss: 1.2392 - regression_loss: 1.0792 - classification_loss: 0.1600 299/500 [================>.............] - ETA: 1:08 - loss: 1.2390 - regression_loss: 1.0790 - classification_loss: 0.1599 300/500 [=================>............] - ETA: 1:07 - loss: 1.2377 - regression_loss: 1.0782 - classification_loss: 0.1595 301/500 [=================>............] - ETA: 1:07 - loss: 1.2375 - regression_loss: 1.0782 - classification_loss: 0.1593 302/500 [=================>............] - ETA: 1:07 - loss: 1.2366 - regression_loss: 1.0776 - classification_loss: 0.1590 303/500 [=================>............] - ETA: 1:06 - loss: 1.2350 - regression_loss: 1.0760 - classification_loss: 0.1590 304/500 [=================>............] - ETA: 1:06 - loss: 1.2395 - regression_loss: 1.0795 - classification_loss: 0.1600 305/500 [=================>............] - ETA: 1:06 - loss: 1.2405 - regression_loss: 1.0806 - classification_loss: 0.1599 306/500 [=================>............] - ETA: 1:05 - loss: 1.2423 - regression_loss: 1.0822 - classification_loss: 0.1601 307/500 [=================>............] - ETA: 1:05 - loss: 1.2444 - regression_loss: 1.0842 - classification_loss: 0.1602 308/500 [=================>............] 
[epoch 16 progress, steps 309-499/500 elided: loss fluctuating between ~1.23 and ~1.25 and ending near 1.238 (regression_loss ~1.07-1.09, classification_loss ~0.156-0.165)]
500/500 [==============================] - 170s 339ms/step - loss: 1.2379 - regression_loss: 1.0813 - classification_loss: 0.1566
326 instances of class plum with average precision: 0.8204
mAP: 0.8204
Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5
Epoch 17/150
[epoch 17 progress, steps 1-14/500 elided: loss noisy at the start of the epoch, ranging ~0.67-1.25]
[epoch 17 progress, steps 15-142/500 elided: loss drifting in the ~1.16-1.28 range (regression_loss ~1.00-1.10, classification_loss ~0.15-0.19)]
- ETA: 2:01 - loss: 1.1680 - regression_loss: 1.0131 - classification_loss: 0.1549 143/500 [=======>......................] - ETA: 2:00 - loss: 1.1657 - regression_loss: 1.0114 - classification_loss: 0.1543 144/500 [=======>......................] - ETA: 2:00 - loss: 1.1642 - regression_loss: 1.0102 - classification_loss: 0.1540 145/500 [=======>......................] - ETA: 2:00 - loss: 1.1579 - regression_loss: 1.0048 - classification_loss: 0.1531 146/500 [=======>......................] - ETA: 1:59 - loss: 1.1619 - regression_loss: 1.0073 - classification_loss: 0.1546 147/500 [=======>......................] - ETA: 1:59 - loss: 1.1673 - regression_loss: 1.0129 - classification_loss: 0.1544 148/500 [=======>......................] - ETA: 1:59 - loss: 1.1689 - regression_loss: 1.0143 - classification_loss: 0.1546 149/500 [=======>......................] - ETA: 1:58 - loss: 1.1678 - regression_loss: 1.0135 - classification_loss: 0.1542 150/500 [========>.....................] - ETA: 1:58 - loss: 1.1695 - regression_loss: 1.0151 - classification_loss: 0.1544 151/500 [========>.....................] - ETA: 1:58 - loss: 1.1729 - regression_loss: 1.0174 - classification_loss: 0.1555 152/500 [========>.....................] - ETA: 1:57 - loss: 1.1737 - regression_loss: 1.0182 - classification_loss: 0.1555 153/500 [========>.....................] - ETA: 1:57 - loss: 1.1791 - regression_loss: 1.0238 - classification_loss: 0.1554 154/500 [========>.....................] - ETA: 1:57 - loss: 1.1792 - regression_loss: 1.0237 - classification_loss: 0.1555 155/500 [========>.....................] - ETA: 1:56 - loss: 1.1804 - regression_loss: 1.0243 - classification_loss: 0.1561 156/500 [========>.....................] - ETA: 1:56 - loss: 1.1817 - regression_loss: 1.0255 - classification_loss: 0.1562 157/500 [========>.....................] - ETA: 1:56 - loss: 1.1783 - regression_loss: 1.0229 - classification_loss: 0.1554 158/500 [========>.....................] 
- ETA: 1:55 - loss: 1.1761 - regression_loss: 1.0211 - classification_loss: 0.1550 159/500 [========>.....................] - ETA: 1:55 - loss: 1.1747 - regression_loss: 1.0202 - classification_loss: 0.1545 160/500 [========>.....................] - ETA: 1:55 - loss: 1.1738 - regression_loss: 1.0194 - classification_loss: 0.1544 161/500 [========>.....................] - ETA: 1:54 - loss: 1.1738 - regression_loss: 1.0196 - classification_loss: 0.1543 162/500 [========>.....................] - ETA: 1:54 - loss: 1.1724 - regression_loss: 1.0186 - classification_loss: 0.1538 163/500 [========>.....................] - ETA: 1:54 - loss: 1.1697 - regression_loss: 1.0166 - classification_loss: 0.1531 164/500 [========>.....................] - ETA: 1:53 - loss: 1.1722 - regression_loss: 1.0185 - classification_loss: 0.1537 165/500 [========>.....................] - ETA: 1:53 - loss: 1.1728 - regression_loss: 1.0195 - classification_loss: 0.1533 166/500 [========>.....................] - ETA: 1:53 - loss: 1.1706 - regression_loss: 1.0177 - classification_loss: 0.1528 167/500 [=========>....................] - ETA: 1:52 - loss: 1.1704 - regression_loss: 1.0181 - classification_loss: 0.1523 168/500 [=========>....................] - ETA: 1:52 - loss: 1.1706 - regression_loss: 1.0185 - classification_loss: 0.1521 169/500 [=========>....................] - ETA: 1:52 - loss: 1.1706 - regression_loss: 1.0187 - classification_loss: 0.1519 170/500 [=========>....................] - ETA: 1:51 - loss: 1.1698 - regression_loss: 1.0182 - classification_loss: 0.1516 171/500 [=========>....................] - ETA: 1:51 - loss: 1.1676 - regression_loss: 1.0160 - classification_loss: 0.1516 172/500 [=========>....................] - ETA: 1:51 - loss: 1.1709 - regression_loss: 1.0197 - classification_loss: 0.1512 173/500 [=========>....................] - ETA: 1:50 - loss: 1.1702 - regression_loss: 1.0192 - classification_loss: 0.1510 174/500 [=========>....................] 
- ETA: 1:50 - loss: 1.1694 - regression_loss: 1.0184 - classification_loss: 0.1509 175/500 [=========>....................] - ETA: 1:50 - loss: 1.1684 - regression_loss: 1.0177 - classification_loss: 0.1507 176/500 [=========>....................] - ETA: 1:49 - loss: 1.1688 - regression_loss: 1.0181 - classification_loss: 0.1507 177/500 [=========>....................] - ETA: 1:49 - loss: 1.1683 - regression_loss: 1.0177 - classification_loss: 0.1506 178/500 [=========>....................] - ETA: 1:49 - loss: 1.1676 - regression_loss: 1.0170 - classification_loss: 0.1506 179/500 [=========>....................] - ETA: 1:48 - loss: 1.1681 - regression_loss: 1.0176 - classification_loss: 0.1505 180/500 [=========>....................] - ETA: 1:48 - loss: 1.1665 - regression_loss: 1.0163 - classification_loss: 0.1502 181/500 [=========>....................] - ETA: 1:47 - loss: 1.1714 - regression_loss: 1.0207 - classification_loss: 0.1507 182/500 [=========>....................] - ETA: 1:47 - loss: 1.1692 - regression_loss: 1.0190 - classification_loss: 0.1502 183/500 [=========>....................] - ETA: 1:47 - loss: 1.1717 - regression_loss: 1.0211 - classification_loss: 0.1506 184/500 [==========>...................] - ETA: 1:46 - loss: 1.1683 - regression_loss: 1.0183 - classification_loss: 0.1501 185/500 [==========>...................] - ETA: 1:46 - loss: 1.1723 - regression_loss: 1.0213 - classification_loss: 0.1510 186/500 [==========>...................] - ETA: 1:46 - loss: 1.1711 - regression_loss: 1.0201 - classification_loss: 0.1510 187/500 [==========>...................] - ETA: 1:45 - loss: 1.1712 - regression_loss: 1.0199 - classification_loss: 0.1514 188/500 [==========>...................] - ETA: 1:45 - loss: 1.1705 - regression_loss: 1.0193 - classification_loss: 0.1511 189/500 [==========>...................] - ETA: 1:45 - loss: 1.1722 - regression_loss: 1.0210 - classification_loss: 0.1512 190/500 [==========>...................] 
- ETA: 1:44 - loss: 1.1699 - regression_loss: 1.0192 - classification_loss: 0.1508 191/500 [==========>...................] - ETA: 1:44 - loss: 1.1676 - regression_loss: 1.0174 - classification_loss: 0.1503 192/500 [==========>...................] - ETA: 1:44 - loss: 1.1672 - regression_loss: 1.0171 - classification_loss: 0.1501 193/500 [==========>...................] - ETA: 1:43 - loss: 1.1723 - regression_loss: 1.0209 - classification_loss: 0.1513 194/500 [==========>...................] - ETA: 1:43 - loss: 1.1716 - regression_loss: 1.0202 - classification_loss: 0.1514 195/500 [==========>...................] - ETA: 1:43 - loss: 1.1691 - regression_loss: 1.0182 - classification_loss: 0.1509 196/500 [==========>...................] - ETA: 1:42 - loss: 1.1690 - regression_loss: 1.0182 - classification_loss: 0.1508 197/500 [==========>...................] - ETA: 1:42 - loss: 1.1680 - regression_loss: 1.0176 - classification_loss: 0.1504 198/500 [==========>...................] - ETA: 1:42 - loss: 1.1661 - regression_loss: 1.0157 - classification_loss: 0.1504 199/500 [==========>...................] - ETA: 1:41 - loss: 1.1664 - regression_loss: 1.0163 - classification_loss: 0.1501 200/500 [===========>..................] - ETA: 1:41 - loss: 1.1655 - regression_loss: 1.0156 - classification_loss: 0.1499 201/500 [===========>..................] - ETA: 1:41 - loss: 1.1641 - regression_loss: 1.0145 - classification_loss: 0.1496 202/500 [===========>..................] - ETA: 1:40 - loss: 1.1610 - regression_loss: 1.0119 - classification_loss: 0.1491 203/500 [===========>..................] - ETA: 1:40 - loss: 1.1599 - regression_loss: 1.0111 - classification_loss: 0.1488 204/500 [===========>..................] - ETA: 1:40 - loss: 1.1591 - regression_loss: 1.0104 - classification_loss: 0.1486 205/500 [===========>..................] - ETA: 1:39 - loss: 1.1598 - regression_loss: 1.0111 - classification_loss: 0.1487 206/500 [===========>..................] 
- ETA: 1:39 - loss: 1.1589 - regression_loss: 1.0106 - classification_loss: 0.1483 207/500 [===========>..................] - ETA: 1:39 - loss: 1.1638 - regression_loss: 1.0151 - classification_loss: 0.1487 208/500 [===========>..................] - ETA: 1:38 - loss: 1.1646 - regression_loss: 1.0156 - classification_loss: 0.1490 209/500 [===========>..................] - ETA: 1:38 - loss: 1.1643 - regression_loss: 1.0156 - classification_loss: 0.1487 210/500 [===========>..................] - ETA: 1:38 - loss: 1.1636 - regression_loss: 1.0150 - classification_loss: 0.1486 211/500 [===========>..................] - ETA: 1:37 - loss: 1.1650 - regression_loss: 1.0163 - classification_loss: 0.1486 212/500 [===========>..................] - ETA: 1:37 - loss: 1.1676 - regression_loss: 1.0182 - classification_loss: 0.1495 213/500 [===========>..................] - ETA: 1:37 - loss: 1.1647 - regression_loss: 1.0153 - classification_loss: 0.1494 214/500 [===========>..................] - ETA: 1:36 - loss: 1.1610 - regression_loss: 1.0121 - classification_loss: 0.1489 215/500 [===========>..................] - ETA: 1:36 - loss: 1.1592 - regression_loss: 1.0100 - classification_loss: 0.1492 216/500 [===========>..................] - ETA: 1:36 - loss: 1.1602 - regression_loss: 1.0110 - classification_loss: 0.1492 217/500 [============>.................] - ETA: 1:35 - loss: 1.1623 - regression_loss: 1.0131 - classification_loss: 0.1493 218/500 [============>.................] - ETA: 1:35 - loss: 1.1619 - regression_loss: 1.0126 - classification_loss: 0.1493 219/500 [============>.................] - ETA: 1:35 - loss: 1.1603 - regression_loss: 1.0113 - classification_loss: 0.1490 220/500 [============>.................] - ETA: 1:34 - loss: 1.1587 - regression_loss: 1.0099 - classification_loss: 0.1488 221/500 [============>.................] - ETA: 1:34 - loss: 1.1614 - regression_loss: 1.0112 - classification_loss: 0.1502 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.1610 - regression_loss: 1.0108 - classification_loss: 0.1501 223/500 [============>.................] - ETA: 1:33 - loss: 1.1662 - regression_loss: 1.0153 - classification_loss: 0.1509 224/500 [============>.................] - ETA: 1:33 - loss: 1.1655 - regression_loss: 1.0148 - classification_loss: 0.1507 225/500 [============>.................] - ETA: 1:32 - loss: 1.1660 - regression_loss: 1.0151 - classification_loss: 0.1508 226/500 [============>.................] - ETA: 1:32 - loss: 1.1638 - regression_loss: 1.0135 - classification_loss: 0.1503 227/500 [============>.................] - ETA: 1:32 - loss: 1.1620 - regression_loss: 1.0118 - classification_loss: 0.1501 228/500 [============>.................] - ETA: 1:31 - loss: 1.1628 - regression_loss: 1.0126 - classification_loss: 0.1502 229/500 [============>.................] - ETA: 1:31 - loss: 1.1618 - regression_loss: 1.0119 - classification_loss: 0.1499 230/500 [============>.................] - ETA: 1:31 - loss: 1.1633 - regression_loss: 1.0132 - classification_loss: 0.1501 231/500 [============>.................] - ETA: 1:30 - loss: 1.1653 - regression_loss: 1.0150 - classification_loss: 0.1504 232/500 [============>.................] - ETA: 1:30 - loss: 1.1673 - regression_loss: 1.0169 - classification_loss: 0.1504 233/500 [============>.................] - ETA: 1:30 - loss: 1.1663 - regression_loss: 1.0161 - classification_loss: 0.1503 234/500 [=============>................] - ETA: 1:29 - loss: 1.1649 - regression_loss: 1.0150 - classification_loss: 0.1499 235/500 [=============>................] - ETA: 1:29 - loss: 1.1651 - regression_loss: 1.0151 - classification_loss: 0.1500 236/500 [=============>................] - ETA: 1:29 - loss: 1.1644 - regression_loss: 1.0127 - classification_loss: 0.1516 237/500 [=============>................] - ETA: 1:28 - loss: 1.1659 - regression_loss: 1.0140 - classification_loss: 0.1519 238/500 [=============>................] 
- ETA: 1:28 - loss: 1.1674 - regression_loss: 1.0155 - classification_loss: 0.1519 239/500 [=============>................] - ETA: 1:28 - loss: 1.1660 - regression_loss: 1.0144 - classification_loss: 0.1516 240/500 [=============>................] - ETA: 1:27 - loss: 1.1659 - regression_loss: 1.0144 - classification_loss: 0.1515 241/500 [=============>................] - ETA: 1:27 - loss: 1.1632 - regression_loss: 1.0123 - classification_loss: 0.1510 242/500 [=============>................] - ETA: 1:27 - loss: 1.1633 - regression_loss: 1.0119 - classification_loss: 0.1515 243/500 [=============>................] - ETA: 1:26 - loss: 1.1655 - regression_loss: 1.0135 - classification_loss: 0.1521 244/500 [=============>................] - ETA: 1:26 - loss: 1.1674 - regression_loss: 1.0150 - classification_loss: 0.1524 245/500 [=============>................] - ETA: 1:26 - loss: 1.1734 - regression_loss: 1.0192 - classification_loss: 0.1543 246/500 [=============>................] - ETA: 1:25 - loss: 1.1729 - regression_loss: 1.0188 - classification_loss: 0.1541 247/500 [=============>................] - ETA: 1:25 - loss: 1.1723 - regression_loss: 1.0184 - classification_loss: 0.1539 248/500 [=============>................] - ETA: 1:25 - loss: 1.1706 - regression_loss: 1.0171 - classification_loss: 0.1535 249/500 [=============>................] - ETA: 1:24 - loss: 1.1710 - regression_loss: 1.0173 - classification_loss: 0.1537 250/500 [==============>...............] - ETA: 1:24 - loss: 1.1717 - regression_loss: 1.0178 - classification_loss: 0.1538 251/500 [==============>...............] - ETA: 1:24 - loss: 1.1733 - regression_loss: 1.0192 - classification_loss: 0.1541 252/500 [==============>...............] - ETA: 1:23 - loss: 1.1727 - regression_loss: 1.0190 - classification_loss: 0.1537 253/500 [==============>...............] - ETA: 1:23 - loss: 1.1746 - regression_loss: 1.0209 - classification_loss: 0.1538 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.1784 - regression_loss: 1.0238 - classification_loss: 0.1546 255/500 [==============>...............] - ETA: 1:22 - loss: 1.1785 - regression_loss: 1.0240 - classification_loss: 0.1545 256/500 [==============>...............] - ETA: 1:22 - loss: 1.1767 - regression_loss: 1.0224 - classification_loss: 0.1543 257/500 [==============>...............] - ETA: 1:22 - loss: 1.1740 - regression_loss: 1.0200 - classification_loss: 0.1540 258/500 [==============>...............] - ETA: 1:21 - loss: 1.1706 - regression_loss: 1.0171 - classification_loss: 0.1535 259/500 [==============>...............] - ETA: 1:21 - loss: 1.1689 - regression_loss: 1.0156 - classification_loss: 0.1533 260/500 [==============>...............] - ETA: 1:21 - loss: 1.1720 - regression_loss: 1.0189 - classification_loss: 0.1531 261/500 [==============>...............] - ETA: 1:20 - loss: 1.1706 - regression_loss: 1.0176 - classification_loss: 0.1530 262/500 [==============>...............] - ETA: 1:20 - loss: 1.1701 - regression_loss: 1.0174 - classification_loss: 0.1527 263/500 [==============>...............] - ETA: 1:20 - loss: 1.1685 - regression_loss: 1.0161 - classification_loss: 0.1524 264/500 [==============>...............] - ETA: 1:19 - loss: 1.1694 - regression_loss: 1.0170 - classification_loss: 0.1524 265/500 [==============>...............] - ETA: 1:19 - loss: 1.1704 - regression_loss: 1.0178 - classification_loss: 0.1526 266/500 [==============>...............] - ETA: 1:19 - loss: 1.1688 - regression_loss: 1.0162 - classification_loss: 0.1526 267/500 [===============>..............] - ETA: 1:18 - loss: 1.1706 - regression_loss: 1.0177 - classification_loss: 0.1529 268/500 [===============>..............] - ETA: 1:18 - loss: 1.1735 - regression_loss: 1.0202 - classification_loss: 0.1533 269/500 [===============>..............] - ETA: 1:18 - loss: 1.1726 - regression_loss: 1.0195 - classification_loss: 0.1531 270/500 [===============>..............] 
- ETA: 1:17 - loss: 1.1724 - regression_loss: 1.0195 - classification_loss: 0.1529 271/500 [===============>..............] - ETA: 1:17 - loss: 1.1751 - regression_loss: 1.0219 - classification_loss: 0.1532 272/500 [===============>..............] - ETA: 1:17 - loss: 1.1759 - regression_loss: 1.0225 - classification_loss: 0.1535 273/500 [===============>..............] - ETA: 1:16 - loss: 1.1761 - regression_loss: 1.0227 - classification_loss: 0.1535 274/500 [===============>..............] - ETA: 1:16 - loss: 1.1737 - regression_loss: 1.0205 - classification_loss: 0.1532 275/500 [===============>..............] - ETA: 1:16 - loss: 1.1736 - regression_loss: 1.0206 - classification_loss: 0.1530 276/500 [===============>..............] - ETA: 1:15 - loss: 1.1718 - regression_loss: 1.0190 - classification_loss: 0.1527 277/500 [===============>..............] - ETA: 1:15 - loss: 1.1689 - regression_loss: 1.0165 - classification_loss: 0.1523 278/500 [===============>..............] - ETA: 1:15 - loss: 1.1678 - regression_loss: 1.0157 - classification_loss: 0.1521 279/500 [===============>..............] - ETA: 1:14 - loss: 1.1655 - regression_loss: 1.0136 - classification_loss: 0.1519 280/500 [===============>..............] - ETA: 1:14 - loss: 1.1694 - regression_loss: 1.0171 - classification_loss: 0.1523 281/500 [===============>..............] - ETA: 1:14 - loss: 1.1693 - regression_loss: 1.0170 - classification_loss: 0.1523 282/500 [===============>..............] - ETA: 1:13 - loss: 1.1682 - regression_loss: 1.0163 - classification_loss: 0.1519 283/500 [===============>..............] - ETA: 1:13 - loss: 1.1671 - regression_loss: 1.0154 - classification_loss: 0.1516 284/500 [================>.............] - ETA: 1:13 - loss: 1.1655 - regression_loss: 1.0143 - classification_loss: 0.1512 285/500 [================>.............] - ETA: 1:12 - loss: 1.1667 - regression_loss: 1.0157 - classification_loss: 0.1509 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.1646 - regression_loss: 1.0139 - classification_loss: 0.1506 287/500 [================>.............] - ETA: 1:12 - loss: 1.1665 - regression_loss: 1.0158 - classification_loss: 0.1507 288/500 [================>.............] - ETA: 1:11 - loss: 1.1670 - regression_loss: 1.0164 - classification_loss: 0.1506 289/500 [================>.............] - ETA: 1:11 - loss: 1.1663 - regression_loss: 1.0160 - classification_loss: 0.1503 290/500 [================>.............] - ETA: 1:10 - loss: 1.1639 - regression_loss: 1.0140 - classification_loss: 0.1499 291/500 [================>.............] - ETA: 1:10 - loss: 1.1647 - regression_loss: 1.0149 - classification_loss: 0.1498 292/500 [================>.............] - ETA: 1:10 - loss: 1.1652 - regression_loss: 1.0156 - classification_loss: 0.1496 293/500 [================>.............] - ETA: 1:09 - loss: 1.1663 - regression_loss: 1.0163 - classification_loss: 0.1500 294/500 [================>.............] - ETA: 1:09 - loss: 1.1676 - regression_loss: 1.0174 - classification_loss: 0.1502 295/500 [================>.............] - ETA: 1:09 - loss: 1.1689 - regression_loss: 1.0185 - classification_loss: 0.1504 296/500 [================>.............] - ETA: 1:08 - loss: 1.1725 - regression_loss: 1.0217 - classification_loss: 0.1508 297/500 [================>.............] - ETA: 1:08 - loss: 1.1704 - regression_loss: 1.0197 - classification_loss: 0.1507 298/500 [================>.............] - ETA: 1:08 - loss: 1.1688 - regression_loss: 1.0184 - classification_loss: 0.1504 299/500 [================>.............] - ETA: 1:07 - loss: 1.1684 - regression_loss: 1.0180 - classification_loss: 0.1504 300/500 [=================>............] - ETA: 1:07 - loss: 1.1677 - regression_loss: 1.0175 - classification_loss: 0.1503 301/500 [=================>............] - ETA: 1:07 - loss: 1.1657 - regression_loss: 1.0157 - classification_loss: 0.1500 302/500 [=================>............] 
- ETA: 1:06 - loss: 1.1634 - regression_loss: 1.0137 - classification_loss: 0.1498 303/500 [=================>............] - ETA: 1:06 - loss: 1.1625 - regression_loss: 1.0129 - classification_loss: 0.1496 304/500 [=================>............] - ETA: 1:06 - loss: 1.1610 - regression_loss: 1.0118 - classification_loss: 0.1492 305/500 [=================>............] - ETA: 1:05 - loss: 1.1617 - regression_loss: 1.0125 - classification_loss: 0.1492 306/500 [=================>............] - ETA: 1:05 - loss: 1.1594 - regression_loss: 1.0106 - classification_loss: 0.1489 307/500 [=================>............] - ETA: 1:05 - loss: 1.1591 - regression_loss: 1.0101 - classification_loss: 0.1490 308/500 [=================>............] - ETA: 1:04 - loss: 1.1584 - regression_loss: 1.0095 - classification_loss: 0.1489 309/500 [=================>............] - ETA: 1:04 - loss: 1.1586 - regression_loss: 1.0098 - classification_loss: 0.1489 310/500 [=================>............] - ETA: 1:04 - loss: 1.1657 - regression_loss: 1.0148 - classification_loss: 0.1509 311/500 [=================>............] - ETA: 1:03 - loss: 1.1692 - regression_loss: 1.0179 - classification_loss: 0.1513 312/500 [=================>............] - ETA: 1:03 - loss: 1.1684 - regression_loss: 1.0173 - classification_loss: 0.1511 313/500 [=================>............] - ETA: 1:03 - loss: 1.1695 - regression_loss: 1.0178 - classification_loss: 0.1517 314/500 [=================>............] - ETA: 1:02 - loss: 1.1694 - regression_loss: 1.0180 - classification_loss: 0.1514 315/500 [=================>............] - ETA: 1:02 - loss: 1.1677 - regression_loss: 1.0166 - classification_loss: 0.1511 316/500 [=================>............] - ETA: 1:02 - loss: 1.1703 - regression_loss: 1.0186 - classification_loss: 0.1518 317/500 [==================>...........] - ETA: 1:01 - loss: 1.1690 - regression_loss: 1.0173 - classification_loss: 0.1517 318/500 [==================>...........] 
- ETA: 1:01 - loss: 1.1692 - regression_loss: 1.0177 - classification_loss: 0.1515 319/500 [==================>...........] - ETA: 1:01 - loss: 1.1695 - regression_loss: 1.0180 - classification_loss: 0.1515 320/500 [==================>...........] - ETA: 1:00 - loss: 1.1691 - regression_loss: 1.0178 - classification_loss: 0.1513 321/500 [==================>...........] - ETA: 1:00 - loss: 1.1696 - regression_loss: 1.0185 - classification_loss: 0.1511 322/500 [==================>...........] - ETA: 1:00 - loss: 1.1694 - regression_loss: 1.0182 - classification_loss: 0.1511 323/500 [==================>...........] - ETA: 59s - loss: 1.1705 - regression_loss: 1.0193 - classification_loss: 0.1512  324/500 [==================>...........] - ETA: 59s - loss: 1.1717 - regression_loss: 1.0205 - classification_loss: 0.1512 325/500 [==================>...........] - ETA: 59s - loss: 1.1735 - regression_loss: 1.0220 - classification_loss: 0.1515 326/500 [==================>...........] - ETA: 58s - loss: 1.1739 - regression_loss: 1.0225 - classification_loss: 0.1515 327/500 [==================>...........] - ETA: 58s - loss: 1.1748 - regression_loss: 1.0231 - classification_loss: 0.1516 328/500 [==================>...........] - ETA: 58s - loss: 1.1749 - regression_loss: 1.0233 - classification_loss: 0.1515 329/500 [==================>...........] - ETA: 57s - loss: 1.1771 - regression_loss: 1.0251 - classification_loss: 0.1520 330/500 [==================>...........] - ETA: 57s - loss: 1.1765 - regression_loss: 1.0244 - classification_loss: 0.1520 331/500 [==================>...........] - ETA: 57s - loss: 1.1764 - regression_loss: 1.0245 - classification_loss: 0.1520 332/500 [==================>...........] - ETA: 56s - loss: 1.1766 - regression_loss: 1.0246 - classification_loss: 0.1520 333/500 [==================>...........] - ETA: 56s - loss: 1.1778 - regression_loss: 1.0256 - classification_loss: 0.1522 334/500 [===================>..........] 
- ETA: 56s - loss: 1.1767 - regression_loss: 1.0245 - classification_loss: 0.1522 335/500 [===================>..........] - ETA: 55s - loss: 1.1747 - regression_loss: 1.0228 - classification_loss: 0.1519 336/500 [===================>..........] - ETA: 55s - loss: 1.1739 - regression_loss: 1.0221 - classification_loss: 0.1517 337/500 [===================>..........] - ETA: 55s - loss: 1.1734 - regression_loss: 1.0218 - classification_loss: 0.1516 338/500 [===================>..........] - ETA: 54s - loss: 1.1743 - regression_loss: 1.0226 - classification_loss: 0.1517 339/500 [===================>..........] - ETA: 54s - loss: 1.1742 - regression_loss: 1.0226 - classification_loss: 0.1515 340/500 [===================>..........] - ETA: 54s - loss: 1.1733 - regression_loss: 1.0215 - classification_loss: 0.1518 341/500 [===================>..........] - ETA: 53s - loss: 1.1713 - regression_loss: 1.0197 - classification_loss: 0.1515 342/500 [===================>..........] - ETA: 53s - loss: 1.1741 - regression_loss: 1.0220 - classification_loss: 0.1520 343/500 [===================>..........] - ETA: 53s - loss: 1.1726 - regression_loss: 1.0208 - classification_loss: 0.1518 344/500 [===================>..........] - ETA: 52s - loss: 1.1726 - regression_loss: 1.0208 - classification_loss: 0.1519 345/500 [===================>..........] - ETA: 52s - loss: 1.1730 - regression_loss: 1.0211 - classification_loss: 0.1519 346/500 [===================>..........] - ETA: 52s - loss: 1.1718 - regression_loss: 1.0201 - classification_loss: 0.1517 347/500 [===================>..........] - ETA: 51s - loss: 1.1717 - regression_loss: 1.0199 - classification_loss: 0.1518 348/500 [===================>..........] - ETA: 51s - loss: 1.1707 - regression_loss: 1.0191 - classification_loss: 0.1516 349/500 [===================>..........] - ETA: 51s - loss: 1.1700 - regression_loss: 1.0185 - classification_loss: 0.1515 350/500 [====================>.........] 
... (per-step progress-bar updates for steps 350-499 of epoch 17 omitted; each line is the same progress bar redrawn) ...
500/500 [==============================] - 170s 339ms/step - loss: 1.1791 - regression_loss: 1.0304 - classification_loss: 0.1487
326 instances of class plum with average precision: 0.8298
mAP: 0.8298
Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5
Epoch 18/150
... (per-step progress-bar updates for steps 1-184 of epoch 18 omitted; last reported state follows) ...
184/500 [==========>...................] - ETA: 1:46 - loss: 1.1794 - regression_loss: 1.0090 - classification_loss: 0.1704
- ETA: 1:46 - loss: 1.1775 - regression_loss: 1.0076 - classification_loss: 0.1700 186/500 [==========>...................] - ETA: 1:45 - loss: 1.1800 - regression_loss: 1.0100 - classification_loss: 0.1700 187/500 [==========>...................] - ETA: 1:45 - loss: 1.1820 - regression_loss: 1.0120 - classification_loss: 0.1700 188/500 [==========>...................] - ETA: 1:45 - loss: 1.1825 - regression_loss: 1.0128 - classification_loss: 0.1697 189/500 [==========>...................] - ETA: 1:44 - loss: 1.1791 - regression_loss: 1.0101 - classification_loss: 0.1690 190/500 [==========>...................] - ETA: 1:44 - loss: 1.1788 - regression_loss: 1.0102 - classification_loss: 0.1686 191/500 [==========>...................] - ETA: 1:44 - loss: 1.1760 - regression_loss: 1.0079 - classification_loss: 0.1681 192/500 [==========>...................] - ETA: 1:43 - loss: 1.1750 - regression_loss: 1.0074 - classification_loss: 0.1676 193/500 [==========>...................] - ETA: 1:43 - loss: 1.1748 - regression_loss: 1.0075 - classification_loss: 0.1673 194/500 [==========>...................] - ETA: 1:43 - loss: 1.1739 - regression_loss: 1.0069 - classification_loss: 0.1670 195/500 [==========>...................] - ETA: 1:42 - loss: 1.1779 - regression_loss: 1.0106 - classification_loss: 0.1673 196/500 [==========>...................] - ETA: 1:42 - loss: 1.1758 - regression_loss: 1.0090 - classification_loss: 0.1668 197/500 [==========>...................] - ETA: 1:42 - loss: 1.1738 - regression_loss: 1.0075 - classification_loss: 0.1663 198/500 [==========>...................] - ETA: 1:41 - loss: 1.1735 - regression_loss: 1.0071 - classification_loss: 0.1664 199/500 [==========>...................] - ETA: 1:41 - loss: 1.1712 - regression_loss: 1.0053 - classification_loss: 0.1658 200/500 [===========>..................] - ETA: 1:41 - loss: 1.1717 - regression_loss: 1.0056 - classification_loss: 0.1661 201/500 [===========>..................] 
- ETA: 1:40 - loss: 1.1712 - regression_loss: 1.0053 - classification_loss: 0.1659 202/500 [===========>..................] - ETA: 1:40 - loss: 1.1697 - regression_loss: 1.0041 - classification_loss: 0.1656 203/500 [===========>..................] - ETA: 1:40 - loss: 1.1678 - regression_loss: 1.0026 - classification_loss: 0.1651 204/500 [===========>..................] - ETA: 1:39 - loss: 1.1645 - regression_loss: 0.9999 - classification_loss: 0.1646 205/500 [===========>..................] - ETA: 1:39 - loss: 1.1616 - regression_loss: 0.9976 - classification_loss: 0.1640 206/500 [===========>..................] - ETA: 1:39 - loss: 1.1610 - regression_loss: 0.9973 - classification_loss: 0.1637 207/500 [===========>..................] - ETA: 1:38 - loss: 1.1650 - regression_loss: 1.0008 - classification_loss: 0.1642 208/500 [===========>..................] - ETA: 1:38 - loss: 1.1616 - regression_loss: 0.9979 - classification_loss: 0.1637 209/500 [===========>..................] - ETA: 1:38 - loss: 1.1614 - regression_loss: 0.9980 - classification_loss: 0.1634 210/500 [===========>..................] - ETA: 1:37 - loss: 1.1626 - regression_loss: 0.9987 - classification_loss: 0.1639 211/500 [===========>..................] - ETA: 1:37 - loss: 1.1612 - regression_loss: 0.9974 - classification_loss: 0.1638 212/500 [===========>..................] - ETA: 1:37 - loss: 1.1590 - regression_loss: 0.9957 - classification_loss: 0.1633 213/500 [===========>..................] - ETA: 1:36 - loss: 1.1563 - regression_loss: 0.9934 - classification_loss: 0.1628 214/500 [===========>..................] - ETA: 1:36 - loss: 1.1552 - regression_loss: 0.9927 - classification_loss: 0.1624 215/500 [===========>..................] - ETA: 1:36 - loss: 1.1536 - regression_loss: 0.9915 - classification_loss: 0.1621 216/500 [===========>..................] - ETA: 1:35 - loss: 1.1560 - regression_loss: 0.9938 - classification_loss: 0.1622 217/500 [============>.................] 
- ETA: 1:35 - loss: 1.1563 - regression_loss: 0.9939 - classification_loss: 0.1624 218/500 [============>.................] - ETA: 1:35 - loss: 1.1560 - regression_loss: 0.9939 - classification_loss: 0.1621 219/500 [============>.................] - ETA: 1:34 - loss: 1.1545 - regression_loss: 0.9925 - classification_loss: 0.1620 220/500 [============>.................] - ETA: 1:34 - loss: 1.1552 - regression_loss: 0.9931 - classification_loss: 0.1621 221/500 [============>.................] - ETA: 1:34 - loss: 1.1546 - regression_loss: 0.9929 - classification_loss: 0.1617 222/500 [============>.................] - ETA: 1:33 - loss: 1.1518 - regression_loss: 0.9907 - classification_loss: 0.1611 223/500 [============>.................] - ETA: 1:33 - loss: 1.1509 - regression_loss: 0.9901 - classification_loss: 0.1608 224/500 [============>.................] - ETA: 1:33 - loss: 1.1489 - regression_loss: 0.9883 - classification_loss: 0.1606 225/500 [============>.................] - ETA: 1:32 - loss: 1.1485 - regression_loss: 0.9879 - classification_loss: 0.1606 226/500 [============>.................] - ETA: 1:32 - loss: 1.1505 - regression_loss: 0.9896 - classification_loss: 0.1609 227/500 [============>.................] - ETA: 1:32 - loss: 1.1542 - regression_loss: 0.9923 - classification_loss: 0.1619 228/500 [============>.................] - ETA: 1:31 - loss: 1.1549 - regression_loss: 0.9928 - classification_loss: 0.1621 229/500 [============>.................] - ETA: 1:31 - loss: 1.1520 - regression_loss: 0.9904 - classification_loss: 0.1617 230/500 [============>.................] - ETA: 1:31 - loss: 1.1537 - regression_loss: 0.9918 - classification_loss: 0.1619 231/500 [============>.................] - ETA: 1:30 - loss: 1.1497 - regression_loss: 0.9884 - classification_loss: 0.1613 232/500 [============>.................] - ETA: 1:30 - loss: 1.1501 - regression_loss: 0.9891 - classification_loss: 0.1610 233/500 [============>.................] 
- ETA: 1:30 - loss: 1.1496 - regression_loss: 0.9889 - classification_loss: 0.1607 234/500 [=============>................] - ETA: 1:29 - loss: 1.1518 - regression_loss: 0.9908 - classification_loss: 0.1610 235/500 [=============>................] - ETA: 1:29 - loss: 1.1503 - regression_loss: 0.9897 - classification_loss: 0.1606 236/500 [=============>................] - ETA: 1:29 - loss: 1.1506 - regression_loss: 0.9900 - classification_loss: 0.1605 237/500 [=============>................] - ETA: 1:28 - loss: 1.1491 - regression_loss: 0.9889 - classification_loss: 0.1602 238/500 [=============>................] - ETA: 1:28 - loss: 1.1535 - regression_loss: 0.9926 - classification_loss: 0.1609 239/500 [=============>................] - ETA: 1:28 - loss: 1.1546 - regression_loss: 0.9936 - classification_loss: 0.1610 240/500 [=============>................] - ETA: 1:27 - loss: 1.1582 - regression_loss: 0.9965 - classification_loss: 0.1618 241/500 [=============>................] - ETA: 1:27 - loss: 1.1580 - regression_loss: 0.9963 - classification_loss: 0.1616 242/500 [=============>................] - ETA: 1:27 - loss: 1.1582 - regression_loss: 0.9968 - classification_loss: 0.1614 243/500 [=============>................] - ETA: 1:26 - loss: 1.1589 - regression_loss: 0.9977 - classification_loss: 0.1612 244/500 [=============>................] - ETA: 1:26 - loss: 1.1586 - regression_loss: 0.9976 - classification_loss: 0.1610 245/500 [=============>................] - ETA: 1:26 - loss: 1.1603 - regression_loss: 0.9992 - classification_loss: 0.1611 246/500 [=============>................] - ETA: 1:25 - loss: 1.1615 - regression_loss: 1.0003 - classification_loss: 0.1612 247/500 [=============>................] - ETA: 1:25 - loss: 1.1614 - regression_loss: 1.0004 - classification_loss: 0.1610 248/500 [=============>................] - ETA: 1:25 - loss: 1.1608 - regression_loss: 1.0001 - classification_loss: 0.1607 249/500 [=============>................] 
- ETA: 1:24 - loss: 1.1600 - regression_loss: 0.9995 - classification_loss: 0.1605 250/500 [==============>...............] - ETA: 1:24 - loss: 1.1599 - regression_loss: 0.9995 - classification_loss: 0.1604 251/500 [==============>...............] - ETA: 1:24 - loss: 1.1556 - regression_loss: 0.9955 - classification_loss: 0.1601 252/500 [==============>...............] - ETA: 1:23 - loss: 1.1551 - regression_loss: 0.9952 - classification_loss: 0.1598 253/500 [==============>...............] - ETA: 1:23 - loss: 1.1545 - regression_loss: 0.9948 - classification_loss: 0.1597 254/500 [==============>...............] - ETA: 1:23 - loss: 1.1524 - regression_loss: 0.9933 - classification_loss: 0.1592 255/500 [==============>...............] - ETA: 1:22 - loss: 1.1492 - regression_loss: 0.9905 - classification_loss: 0.1586 256/500 [==============>...............] - ETA: 1:22 - loss: 1.1468 - regression_loss: 0.9886 - classification_loss: 0.1582 257/500 [==============>...............] - ETA: 1:22 - loss: 1.1486 - regression_loss: 0.9899 - classification_loss: 0.1587 258/500 [==============>...............] - ETA: 1:21 - loss: 1.1516 - regression_loss: 0.9922 - classification_loss: 0.1594 259/500 [==============>...............] - ETA: 1:21 - loss: 1.1516 - regression_loss: 0.9924 - classification_loss: 0.1593 260/500 [==============>...............] - ETA: 1:21 - loss: 1.1488 - regression_loss: 0.9899 - classification_loss: 0.1589 261/500 [==============>...............] - ETA: 1:20 - loss: 1.1461 - regression_loss: 0.9878 - classification_loss: 0.1584 262/500 [==============>...............] - ETA: 1:20 - loss: 1.1475 - regression_loss: 0.9889 - classification_loss: 0.1585 263/500 [==============>...............] - ETA: 1:20 - loss: 1.1515 - regression_loss: 0.9921 - classification_loss: 0.1594 264/500 [==============>...............] - ETA: 1:19 - loss: 1.1507 - regression_loss: 0.9915 - classification_loss: 0.1593 265/500 [==============>...............] 
- ETA: 1:19 - loss: 1.1519 - regression_loss: 0.9923 - classification_loss: 0.1596 266/500 [==============>...............] - ETA: 1:19 - loss: 1.1520 - regression_loss: 0.9928 - classification_loss: 0.1592 267/500 [===============>..............] - ETA: 1:18 - loss: 1.1509 - regression_loss: 0.9920 - classification_loss: 0.1590 268/500 [===============>..............] - ETA: 1:18 - loss: 1.1484 - regression_loss: 0.9899 - classification_loss: 0.1586 269/500 [===============>..............] - ETA: 1:18 - loss: 1.1486 - regression_loss: 0.9900 - classification_loss: 0.1587 270/500 [===============>..............] - ETA: 1:17 - loss: 1.1492 - regression_loss: 0.9905 - classification_loss: 0.1588 271/500 [===============>..............] - ETA: 1:17 - loss: 1.1482 - regression_loss: 0.9894 - classification_loss: 0.1589 272/500 [===============>..............] - ETA: 1:17 - loss: 1.1499 - regression_loss: 0.9905 - classification_loss: 0.1595 273/500 [===============>..............] - ETA: 1:16 - loss: 1.1526 - regression_loss: 0.9929 - classification_loss: 0.1596 274/500 [===============>..............] - ETA: 1:16 - loss: 1.1564 - regression_loss: 0.9961 - classification_loss: 0.1602 275/500 [===============>..............] - ETA: 1:16 - loss: 1.1548 - regression_loss: 0.9950 - classification_loss: 0.1598 276/500 [===============>..............] - ETA: 1:15 - loss: 1.1548 - regression_loss: 0.9951 - classification_loss: 0.1597 277/500 [===============>..............] - ETA: 1:15 - loss: 1.1534 - regression_loss: 0.9942 - classification_loss: 0.1593 278/500 [===============>..............] - ETA: 1:15 - loss: 1.1524 - regression_loss: 0.9931 - classification_loss: 0.1592 279/500 [===============>..............] - ETA: 1:14 - loss: 1.1540 - regression_loss: 0.9942 - classification_loss: 0.1598 280/500 [===============>..............] - ETA: 1:14 - loss: 1.1526 - regression_loss: 0.9929 - classification_loss: 0.1597 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.1527 - regression_loss: 0.9931 - classification_loss: 0.1596 282/500 [===============>..............] - ETA: 1:13 - loss: 1.1523 - regression_loss: 0.9930 - classification_loss: 0.1593 283/500 [===============>..............] - ETA: 1:13 - loss: 1.1530 - regression_loss: 0.9938 - classification_loss: 0.1592 284/500 [================>.............] - ETA: 1:13 - loss: 1.1544 - regression_loss: 0.9950 - classification_loss: 0.1594 285/500 [================>.............] - ETA: 1:12 - loss: 1.1538 - regression_loss: 0.9945 - classification_loss: 0.1593 286/500 [================>.............] - ETA: 1:12 - loss: 1.1553 - regression_loss: 0.9963 - classification_loss: 0.1590 287/500 [================>.............] - ETA: 1:12 - loss: 1.1582 - regression_loss: 0.9984 - classification_loss: 0.1598 288/500 [================>.............] - ETA: 1:11 - loss: 1.1590 - regression_loss: 0.9994 - classification_loss: 0.1597 289/500 [================>.............] - ETA: 1:11 - loss: 1.1576 - regression_loss: 0.9983 - classification_loss: 0.1593 290/500 [================>.............] - ETA: 1:11 - loss: 1.1575 - regression_loss: 0.9983 - classification_loss: 0.1593 291/500 [================>.............] - ETA: 1:10 - loss: 1.1568 - regression_loss: 0.9978 - classification_loss: 0.1590 292/500 [================>.............] - ETA: 1:10 - loss: 1.1556 - regression_loss: 0.9967 - classification_loss: 0.1589 293/500 [================>.............] - ETA: 1:10 - loss: 1.1547 - regression_loss: 0.9956 - classification_loss: 0.1591 294/500 [================>.............] - ETA: 1:09 - loss: 1.1518 - regression_loss: 0.9931 - classification_loss: 0.1587 295/500 [================>.............] - ETA: 1:09 - loss: 1.1513 - regression_loss: 0.9928 - classification_loss: 0.1585 296/500 [================>.............] - ETA: 1:09 - loss: 1.1513 - regression_loss: 0.9928 - classification_loss: 0.1584 297/500 [================>.............] 
- ETA: 1:08 - loss: 1.1514 - regression_loss: 0.9926 - classification_loss: 0.1588 298/500 [================>.............] - ETA: 1:08 - loss: 1.1492 - regression_loss: 0.9909 - classification_loss: 0.1583 299/500 [================>.............] - ETA: 1:08 - loss: 1.1520 - regression_loss: 0.9933 - classification_loss: 0.1587 300/500 [=================>............] - ETA: 1:07 - loss: 1.1500 - regression_loss: 0.9916 - classification_loss: 0.1585 301/500 [=================>............] - ETA: 1:07 - loss: 1.1492 - regression_loss: 0.9909 - classification_loss: 0.1582 302/500 [=================>............] - ETA: 1:06 - loss: 1.1469 - regression_loss: 0.9891 - classification_loss: 0.1578 303/500 [=================>............] - ETA: 1:06 - loss: 1.1492 - regression_loss: 0.9911 - classification_loss: 0.1581 304/500 [=================>............] - ETA: 1:06 - loss: 1.1484 - regression_loss: 0.9906 - classification_loss: 0.1578 305/500 [=================>............] - ETA: 1:05 - loss: 1.1495 - regression_loss: 0.9915 - classification_loss: 0.1580 306/500 [=================>............] - ETA: 1:05 - loss: 1.1507 - regression_loss: 0.9925 - classification_loss: 0.1582 307/500 [=================>............] - ETA: 1:05 - loss: 1.1518 - regression_loss: 0.9935 - classification_loss: 0.1584 308/500 [=================>............] - ETA: 1:04 - loss: 1.1495 - regression_loss: 0.9914 - classification_loss: 0.1581 309/500 [=================>............] - ETA: 1:04 - loss: 1.1515 - regression_loss: 0.9929 - classification_loss: 0.1585 310/500 [=================>............] - ETA: 1:04 - loss: 1.1491 - regression_loss: 0.9910 - classification_loss: 0.1581 311/500 [=================>............] - ETA: 1:03 - loss: 1.1550 - regression_loss: 0.9959 - classification_loss: 0.1591 312/500 [=================>............] - ETA: 1:03 - loss: 1.1555 - regression_loss: 0.9963 - classification_loss: 0.1592 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.1540 - regression_loss: 0.9952 - classification_loss: 0.1588 314/500 [=================>............] - ETA: 1:02 - loss: 1.1522 - regression_loss: 0.9938 - classification_loss: 0.1584 315/500 [=================>............] - ETA: 1:02 - loss: 1.1509 - regression_loss: 0.9924 - classification_loss: 0.1585 316/500 [=================>............] - ETA: 1:02 - loss: 1.1505 - regression_loss: 0.9922 - classification_loss: 0.1583 317/500 [==================>...........] - ETA: 1:01 - loss: 1.1516 - regression_loss: 0.9928 - classification_loss: 0.1587 318/500 [==================>...........] - ETA: 1:01 - loss: 1.1495 - regression_loss: 0.9911 - classification_loss: 0.1584 319/500 [==================>...........] - ETA: 1:01 - loss: 1.1516 - regression_loss: 0.9928 - classification_loss: 0.1588 320/500 [==================>...........] - ETA: 1:00 - loss: 1.1509 - regression_loss: 0.9923 - classification_loss: 0.1586 321/500 [==================>...........] - ETA: 1:00 - loss: 1.1529 - regression_loss: 0.9939 - classification_loss: 0.1590 322/500 [==================>...........] - ETA: 1:00 - loss: 1.1545 - regression_loss: 0.9953 - classification_loss: 0.1592 323/500 [==================>...........] - ETA: 59s - loss: 1.1548 - regression_loss: 0.9956 - classification_loss: 0.1592  324/500 [==================>...........] - ETA: 59s - loss: 1.1547 - regression_loss: 0.9956 - classification_loss: 0.1590 325/500 [==================>...........] - ETA: 59s - loss: 1.1544 - regression_loss: 0.9955 - classification_loss: 0.1589 326/500 [==================>...........] - ETA: 58s - loss: 1.1550 - regression_loss: 0.9963 - classification_loss: 0.1588 327/500 [==================>...........] - ETA: 58s - loss: 1.1553 - regression_loss: 0.9966 - classification_loss: 0.1587 328/500 [==================>...........] - ETA: 58s - loss: 1.1543 - regression_loss: 0.9959 - classification_loss: 0.1585 329/500 [==================>...........] 
- ETA: 57s - loss: 1.1550 - regression_loss: 0.9966 - classification_loss: 0.1584 330/500 [==================>...........] - ETA: 57s - loss: 1.1530 - regression_loss: 0.9949 - classification_loss: 0.1581 331/500 [==================>...........] - ETA: 57s - loss: 1.1568 - regression_loss: 0.9977 - classification_loss: 0.1591 332/500 [==================>...........] - ETA: 56s - loss: 1.1583 - regression_loss: 0.9990 - classification_loss: 0.1594 333/500 [==================>...........] - ETA: 56s - loss: 1.1587 - regression_loss: 0.9993 - classification_loss: 0.1594 334/500 [===================>..........] - ETA: 56s - loss: 1.1578 - regression_loss: 0.9987 - classification_loss: 0.1591 335/500 [===================>..........] - ETA: 55s - loss: 1.1560 - regression_loss: 0.9972 - classification_loss: 0.1588 336/500 [===================>..........] - ETA: 55s - loss: 1.1555 - regression_loss: 0.9969 - classification_loss: 0.1586 337/500 [===================>..........] - ETA: 55s - loss: 1.1560 - regression_loss: 0.9974 - classification_loss: 0.1586 338/500 [===================>..........] - ETA: 54s - loss: 1.1551 - regression_loss: 0.9966 - classification_loss: 0.1585 339/500 [===================>..........] - ETA: 54s - loss: 1.1546 - regression_loss: 0.9961 - classification_loss: 0.1585 340/500 [===================>..........] - ETA: 54s - loss: 1.1549 - regression_loss: 0.9964 - classification_loss: 0.1585 341/500 [===================>..........] - ETA: 53s - loss: 1.1550 - regression_loss: 0.9965 - classification_loss: 0.1585 342/500 [===================>..........] - ETA: 53s - loss: 1.1567 - regression_loss: 0.9979 - classification_loss: 0.1587 343/500 [===================>..........] - ETA: 53s - loss: 1.1573 - regression_loss: 0.9985 - classification_loss: 0.1588 344/500 [===================>..........] - ETA: 52s - loss: 1.1559 - regression_loss: 0.9974 - classification_loss: 0.1584 345/500 [===================>..........] 
- ETA: 52s - loss: 1.1552 - regression_loss: 0.9969 - classification_loss: 0.1582 346/500 [===================>..........] - ETA: 52s - loss: 1.1546 - regression_loss: 0.9963 - classification_loss: 0.1582 347/500 [===================>..........] - ETA: 51s - loss: 1.1556 - regression_loss: 0.9973 - classification_loss: 0.1584 348/500 [===================>..........] - ETA: 51s - loss: 1.1561 - regression_loss: 0.9979 - classification_loss: 0.1582 349/500 [===================>..........] - ETA: 51s - loss: 1.1567 - regression_loss: 0.9985 - classification_loss: 0.1582 350/500 [====================>.........] - ETA: 50s - loss: 1.1566 - regression_loss: 0.9985 - classification_loss: 0.1582 351/500 [====================>.........] - ETA: 50s - loss: 1.1559 - regression_loss: 0.9979 - classification_loss: 0.1581 352/500 [====================>.........] - ETA: 50s - loss: 1.1556 - regression_loss: 0.9977 - classification_loss: 0.1578 353/500 [====================>.........] - ETA: 49s - loss: 1.1566 - regression_loss: 0.9986 - classification_loss: 0.1580 354/500 [====================>.........] - ETA: 49s - loss: 1.1547 - regression_loss: 0.9970 - classification_loss: 0.1578 355/500 [====================>.........] - ETA: 49s - loss: 1.1550 - regression_loss: 0.9973 - classification_loss: 0.1577 356/500 [====================>.........] - ETA: 48s - loss: 1.1541 - regression_loss: 0.9966 - classification_loss: 0.1575 357/500 [====================>.........] - ETA: 48s - loss: 1.1530 - regression_loss: 0.9956 - classification_loss: 0.1573 358/500 [====================>.........] - ETA: 48s - loss: 1.1532 - regression_loss: 0.9959 - classification_loss: 0.1573 359/500 [====================>.........] - ETA: 47s - loss: 1.1510 - regression_loss: 0.9940 - classification_loss: 0.1570 360/500 [====================>.........] - ETA: 47s - loss: 1.1521 - regression_loss: 0.9951 - classification_loss: 0.1570 361/500 [====================>.........] 
- ETA: 47s - loss: 1.1536 - regression_loss: 0.9964 - classification_loss: 0.1572 362/500 [====================>.........] - ETA: 46s - loss: 1.1547 - regression_loss: 0.9974 - classification_loss: 0.1574 363/500 [====================>.........] - ETA: 46s - loss: 1.1534 - regression_loss: 0.9963 - classification_loss: 0.1571 364/500 [====================>.........] - ETA: 46s - loss: 1.1546 - regression_loss: 0.9976 - classification_loss: 0.1570 365/500 [====================>.........] - ETA: 45s - loss: 1.1535 - regression_loss: 0.9968 - classification_loss: 0.1567 366/500 [====================>.........] - ETA: 45s - loss: 1.1526 - regression_loss: 0.9962 - classification_loss: 0.1565 367/500 [=====================>........] - ETA: 45s - loss: 1.1520 - regression_loss: 0.9957 - classification_loss: 0.1563 368/500 [=====================>........] - ETA: 44s - loss: 1.1523 - regression_loss: 0.9961 - classification_loss: 0.1563 369/500 [=====================>........] - ETA: 44s - loss: 1.1527 - regression_loss: 0.9966 - classification_loss: 0.1562 370/500 [=====================>........] - ETA: 43s - loss: 1.1587 - regression_loss: 0.9994 - classification_loss: 0.1593 371/500 [=====================>........] - ETA: 43s - loss: 1.1599 - regression_loss: 1.0006 - classification_loss: 0.1592 372/500 [=====================>........] - ETA: 43s - loss: 1.1631 - regression_loss: 1.0020 - classification_loss: 0.1610 373/500 [=====================>........] - ETA: 42s - loss: 1.1632 - regression_loss: 1.0022 - classification_loss: 0.1610 374/500 [=====================>........] - ETA: 42s - loss: 1.1627 - regression_loss: 1.0017 - classification_loss: 0.1610 375/500 [=====================>........] - ETA: 42s - loss: 1.1657 - regression_loss: 1.0042 - classification_loss: 0.1615 376/500 [=====================>........] - ETA: 41s - loss: 1.1654 - regression_loss: 1.0040 - classification_loss: 0.1614 377/500 [=====================>........] 
- ETA: 41s - loss: 1.1671 - regression_loss: 1.0052 - classification_loss: 0.1619 378/500 [=====================>........] - ETA: 41s - loss: 1.1661 - regression_loss: 1.0044 - classification_loss: 0.1617 379/500 [=====================>........] - ETA: 40s - loss: 1.1679 - regression_loss: 1.0056 - classification_loss: 0.1623 380/500 [=====================>........] - ETA: 40s - loss: 1.1683 - regression_loss: 1.0059 - classification_loss: 0.1624 381/500 [=====================>........] - ETA: 40s - loss: 1.1691 - regression_loss: 1.0067 - classification_loss: 0.1623 382/500 [=====================>........] - ETA: 39s - loss: 1.1697 - regression_loss: 1.0072 - classification_loss: 0.1625 383/500 [=====================>........] - ETA: 39s - loss: 1.1681 - regression_loss: 1.0059 - classification_loss: 0.1622 384/500 [======================>.......] - ETA: 39s - loss: 1.1689 - regression_loss: 1.0064 - classification_loss: 0.1625 385/500 [======================>.......] - ETA: 38s - loss: 1.1674 - regression_loss: 1.0051 - classification_loss: 0.1622 386/500 [======================>.......] - ETA: 38s - loss: 1.1673 - regression_loss: 1.0051 - classification_loss: 0.1622 387/500 [======================>.......] - ETA: 38s - loss: 1.1668 - regression_loss: 1.0044 - classification_loss: 0.1623 388/500 [======================>.......] - ETA: 37s - loss: 1.1665 - regression_loss: 1.0042 - classification_loss: 0.1623 389/500 [======================>.......] - ETA: 37s - loss: 1.1679 - regression_loss: 1.0056 - classification_loss: 0.1623 390/500 [======================>.......] - ETA: 37s - loss: 1.1691 - regression_loss: 1.0066 - classification_loss: 0.1625 391/500 [======================>.......] - ETA: 36s - loss: 1.1694 - regression_loss: 1.0069 - classification_loss: 0.1625 392/500 [======================>.......] - ETA: 36s - loss: 1.1698 - regression_loss: 1.0073 - classification_loss: 0.1625 393/500 [======================>.......] 
500/500 [==============================] - 169s 338ms/step - loss: 1.1578 - regression_loss: 0.9996 - classification_loss: 0.1583
326 instances of class plum with average precision: 0.8287
mAP: 0.8287
Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5
Epoch 19/150
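The per-batch records above all follow one pattern: running totals for `loss` together with its two components, `regression_loss` and `classification_loss` (their sum equals `loss`). A minimal sketch of recovering a loss curve from such a flattened log, assuming the record format shown here (the regex and function name are illustrative, not part of keras-retinanet):

```python
import re

# Each flattened progress-bar record looks like:
#   "394/500 [===>...] - ETA: 35s - loss: 1.1676 - regression_loss: 1.0055 - classification_loss: 0.1621"
# (format assumed from the log above)
RECORD = re.compile(
    r"(\d+)/(\d+) \[[=>.]+\]"          # batch counter and progress bar
    r".*?loss: ([\d.]+)"               # first "loss:" is the total loss
    r" - regression_loss: ([\d.]+)"
    r" - classification_loss: ([\d.]+)"
)

def parse_progress(text):
    """Extract (batch, total, loss, regression_loss, classification_loss) tuples."""
    records = []
    for m in RECORD.finditer(text):
        batch, total = int(m.group(1)), int(m.group(2))
        loss, reg, cls = (float(m.group(i)) for i in (3, 4, 5))
        records.append((batch, total, loss, reg, cls))
    return records
```

Because Keras reports running averages, plotting the last record of each epoch (the `500/500` line) gives the per-epoch training loss.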
- ETA: 1:32 - loss: 1.1446 - regression_loss: 1.0045 - classification_loss: 0.1401 229/500 [============>.................] - ETA: 1:32 - loss: 1.1447 - regression_loss: 1.0049 - classification_loss: 0.1399 230/500 [============>.................] - ETA: 1:31 - loss: 1.1448 - regression_loss: 1.0050 - classification_loss: 0.1397 231/500 [============>.................] - ETA: 1:31 - loss: 1.1507 - regression_loss: 1.0101 - classification_loss: 0.1406 232/500 [============>.................] - ETA: 1:31 - loss: 1.1508 - regression_loss: 1.0103 - classification_loss: 0.1405 233/500 [============>.................] - ETA: 1:30 - loss: 1.1476 - regression_loss: 1.0075 - classification_loss: 0.1401 234/500 [=============>................] - ETA: 1:30 - loss: 1.1444 - regression_loss: 1.0048 - classification_loss: 0.1397 235/500 [=============>................] - ETA: 1:30 - loss: 1.1455 - regression_loss: 1.0056 - classification_loss: 0.1399 236/500 [=============>................] - ETA: 1:29 - loss: 1.1450 - regression_loss: 1.0050 - classification_loss: 0.1400 237/500 [=============>................] - ETA: 1:29 - loss: 1.1440 - regression_loss: 1.0042 - classification_loss: 0.1398 238/500 [=============>................] - ETA: 1:29 - loss: 1.1413 - regression_loss: 1.0017 - classification_loss: 0.1396 239/500 [=============>................] - ETA: 1:28 - loss: 1.1399 - regression_loss: 1.0007 - classification_loss: 0.1392 240/500 [=============>................] - ETA: 1:28 - loss: 1.1396 - regression_loss: 1.0006 - classification_loss: 0.1390 241/500 [=============>................] - ETA: 1:28 - loss: 1.1386 - regression_loss: 0.9997 - classification_loss: 0.1389 242/500 [=============>................] - ETA: 1:27 - loss: 1.1355 - regression_loss: 0.9970 - classification_loss: 0.1385 243/500 [=============>................] - ETA: 1:27 - loss: 1.1371 - regression_loss: 0.9980 - classification_loss: 0.1391 244/500 [=============>................] 
- ETA: 1:27 - loss: 1.1387 - regression_loss: 0.9992 - classification_loss: 0.1395 245/500 [=============>................] - ETA: 1:26 - loss: 1.1442 - regression_loss: 1.0040 - classification_loss: 0.1402 246/500 [=============>................] - ETA: 1:26 - loss: 1.1432 - regression_loss: 1.0032 - classification_loss: 0.1400 247/500 [=============>................] - ETA: 1:26 - loss: 1.1448 - regression_loss: 1.0046 - classification_loss: 0.1402 248/500 [=============>................] - ETA: 1:25 - loss: 1.1443 - regression_loss: 1.0042 - classification_loss: 0.1401 249/500 [=============>................] - ETA: 1:25 - loss: 1.1450 - regression_loss: 1.0053 - classification_loss: 0.1397 250/500 [==============>...............] - ETA: 1:25 - loss: 1.1434 - regression_loss: 1.0040 - classification_loss: 0.1395 251/500 [==============>...............] - ETA: 1:24 - loss: 1.1456 - regression_loss: 1.0062 - classification_loss: 0.1394 252/500 [==============>...............] - ETA: 1:24 - loss: 1.1460 - regression_loss: 1.0066 - classification_loss: 0.1394 253/500 [==============>...............] - ETA: 1:24 - loss: 1.1450 - regression_loss: 1.0059 - classification_loss: 0.1392 254/500 [==============>...............] - ETA: 1:23 - loss: 1.1438 - regression_loss: 1.0049 - classification_loss: 0.1389 255/500 [==============>...............] - ETA: 1:23 - loss: 1.1438 - regression_loss: 1.0050 - classification_loss: 0.1388 256/500 [==============>...............] - ETA: 1:23 - loss: 1.1425 - regression_loss: 1.0040 - classification_loss: 0.1386 257/500 [==============>...............] - ETA: 1:22 - loss: 1.1428 - regression_loss: 1.0042 - classification_loss: 0.1387 258/500 [==============>...............] - ETA: 1:22 - loss: 1.1447 - regression_loss: 1.0056 - classification_loss: 0.1391 259/500 [==============>...............] - ETA: 1:22 - loss: 1.1444 - regression_loss: 1.0054 - classification_loss: 0.1390 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.1441 - regression_loss: 1.0052 - classification_loss: 0.1389 261/500 [==============>...............] - ETA: 1:21 - loss: 1.1438 - regression_loss: 1.0049 - classification_loss: 0.1389 262/500 [==============>...............] - ETA: 1:21 - loss: 1.1433 - regression_loss: 1.0045 - classification_loss: 0.1388 263/500 [==============>...............] - ETA: 1:20 - loss: 1.1434 - regression_loss: 1.0047 - classification_loss: 0.1387 264/500 [==============>...............] - ETA: 1:20 - loss: 1.1446 - regression_loss: 1.0059 - classification_loss: 0.1388 265/500 [==============>...............] - ETA: 1:19 - loss: 1.1443 - regression_loss: 1.0058 - classification_loss: 0.1385 266/500 [==============>...............] - ETA: 1:19 - loss: 1.1412 - regression_loss: 1.0031 - classification_loss: 0.1381 267/500 [===============>..............] - ETA: 1:19 - loss: 1.1425 - regression_loss: 1.0043 - classification_loss: 0.1382 268/500 [===============>..............] - ETA: 1:18 - loss: 1.1470 - regression_loss: 1.0072 - classification_loss: 0.1398 269/500 [===============>..............] - ETA: 1:18 - loss: 1.1489 - regression_loss: 1.0089 - classification_loss: 0.1400 270/500 [===============>..............] - ETA: 1:18 - loss: 1.1490 - regression_loss: 1.0089 - classification_loss: 0.1401 271/500 [===============>..............] - ETA: 1:17 - loss: 1.1472 - regression_loss: 1.0075 - classification_loss: 0.1397 272/500 [===============>..............] - ETA: 1:17 - loss: 1.1459 - regression_loss: 1.0065 - classification_loss: 0.1395 273/500 [===============>..............] - ETA: 1:17 - loss: 1.1461 - regression_loss: 1.0069 - classification_loss: 0.1392 274/500 [===============>..............] - ETA: 1:16 - loss: 1.1452 - regression_loss: 1.0063 - classification_loss: 0.1390 275/500 [===============>..............] - ETA: 1:16 - loss: 1.1456 - regression_loss: 1.0064 - classification_loss: 0.1391 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.1442 - regression_loss: 1.0054 - classification_loss: 0.1388 277/500 [===============>..............] - ETA: 1:15 - loss: 1.1462 - regression_loss: 1.0070 - classification_loss: 0.1392 278/500 [===============>..............] - ETA: 1:15 - loss: 1.1449 - regression_loss: 1.0060 - classification_loss: 0.1389 279/500 [===============>..............] - ETA: 1:15 - loss: 1.1436 - regression_loss: 1.0050 - classification_loss: 0.1386 280/500 [===============>..............] - ETA: 1:14 - loss: 1.1428 - regression_loss: 1.0045 - classification_loss: 0.1384 281/500 [===============>..............] - ETA: 1:14 - loss: 1.1428 - regression_loss: 1.0045 - classification_loss: 0.1382 282/500 [===============>..............] - ETA: 1:14 - loss: 1.1433 - regression_loss: 1.0050 - classification_loss: 0.1383 283/500 [===============>..............] - ETA: 1:13 - loss: 1.1418 - regression_loss: 1.0036 - classification_loss: 0.1382 284/500 [================>.............] - ETA: 1:13 - loss: 1.1433 - regression_loss: 1.0048 - classification_loss: 0.1386 285/500 [================>.............] - ETA: 1:13 - loss: 1.1419 - regression_loss: 1.0037 - classification_loss: 0.1383 286/500 [================>.............] - ETA: 1:12 - loss: 1.1422 - regression_loss: 1.0040 - classification_loss: 0.1383 287/500 [================>.............] - ETA: 1:12 - loss: 1.1407 - regression_loss: 1.0027 - classification_loss: 0.1380 288/500 [================>.............] - ETA: 1:12 - loss: 1.1404 - regression_loss: 1.0025 - classification_loss: 0.1379 289/500 [================>.............] - ETA: 1:11 - loss: 1.1397 - regression_loss: 1.0019 - classification_loss: 0.1377 290/500 [================>.............] - ETA: 1:11 - loss: 1.1453 - regression_loss: 1.0068 - classification_loss: 0.1385 291/500 [================>.............] - ETA: 1:11 - loss: 1.1439 - regression_loss: 1.0057 - classification_loss: 0.1382 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.1432 - regression_loss: 1.0050 - classification_loss: 0.1382 293/500 [================>.............] - ETA: 1:10 - loss: 1.1441 - regression_loss: 1.0056 - classification_loss: 0.1384 294/500 [================>.............] - ETA: 1:10 - loss: 1.1517 - regression_loss: 1.0108 - classification_loss: 0.1409 295/500 [================>.............] - ETA: 1:09 - loss: 1.1551 - regression_loss: 1.0132 - classification_loss: 0.1419 296/500 [================>.............] - ETA: 1:09 - loss: 1.1527 - regression_loss: 1.0111 - classification_loss: 0.1416 297/500 [================>.............] - ETA: 1:09 - loss: 1.1559 - regression_loss: 1.0138 - classification_loss: 0.1421 298/500 [================>.............] - ETA: 1:08 - loss: 1.1545 - regression_loss: 1.0127 - classification_loss: 0.1418 299/500 [================>.............] - ETA: 1:08 - loss: 1.1552 - regression_loss: 1.0134 - classification_loss: 0.1418 300/500 [=================>............] - ETA: 1:08 - loss: 1.1580 - regression_loss: 1.0159 - classification_loss: 0.1421 301/500 [=================>............] - ETA: 1:07 - loss: 1.1583 - regression_loss: 1.0162 - classification_loss: 0.1421 302/500 [=================>............] - ETA: 1:07 - loss: 1.1569 - regression_loss: 1.0146 - classification_loss: 0.1423 303/500 [=================>............] - ETA: 1:07 - loss: 1.1569 - regression_loss: 1.0148 - classification_loss: 0.1421 304/500 [=================>............] - ETA: 1:06 - loss: 1.1573 - regression_loss: 1.0149 - classification_loss: 0.1424 305/500 [=================>............] - ETA: 1:06 - loss: 1.1576 - regression_loss: 1.0151 - classification_loss: 0.1424 306/500 [=================>............] - ETA: 1:06 - loss: 1.1577 - regression_loss: 1.0154 - classification_loss: 0.1422 307/500 [=================>............] - ETA: 1:05 - loss: 1.1580 - regression_loss: 1.0155 - classification_loss: 0.1424 308/500 [=================>............] 
- ETA: 1:05 - loss: 1.1556 - regression_loss: 1.0134 - classification_loss: 0.1421 309/500 [=================>............] - ETA: 1:05 - loss: 1.1552 - regression_loss: 1.0132 - classification_loss: 0.1420 310/500 [=================>............] - ETA: 1:04 - loss: 1.1534 - regression_loss: 1.0117 - classification_loss: 0.1417 311/500 [=================>............] - ETA: 1:04 - loss: 1.1540 - regression_loss: 1.0124 - classification_loss: 0.1417 312/500 [=================>............] - ETA: 1:03 - loss: 1.1558 - regression_loss: 1.0135 - classification_loss: 0.1423 313/500 [=================>............] - ETA: 1:03 - loss: 1.1563 - regression_loss: 1.0140 - classification_loss: 0.1422 314/500 [=================>............] - ETA: 1:03 - loss: 1.1545 - regression_loss: 1.0126 - classification_loss: 0.1418 315/500 [=================>............] - ETA: 1:02 - loss: 1.1528 - regression_loss: 1.0112 - classification_loss: 0.1416 316/500 [=================>............] - ETA: 1:02 - loss: 1.1509 - regression_loss: 1.0093 - classification_loss: 0.1416 317/500 [==================>...........] - ETA: 1:02 - loss: 1.1508 - regression_loss: 1.0093 - classification_loss: 0.1415 318/500 [==================>...........] - ETA: 1:01 - loss: 1.1487 - regression_loss: 1.0075 - classification_loss: 0.1412 319/500 [==================>...........] - ETA: 1:01 - loss: 1.1481 - regression_loss: 1.0070 - classification_loss: 0.1411 320/500 [==================>...........] - ETA: 1:01 - loss: 1.1480 - regression_loss: 1.0071 - classification_loss: 0.1409 321/500 [==================>...........] - ETA: 1:00 - loss: 1.1477 - regression_loss: 1.0069 - classification_loss: 0.1409 322/500 [==================>...........] - ETA: 1:00 - loss: 1.1484 - regression_loss: 1.0075 - classification_loss: 0.1409 323/500 [==================>...........] - ETA: 1:00 - loss: 1.1475 - regression_loss: 1.0068 - classification_loss: 0.1407 324/500 [==================>...........] 
- ETA: 59s - loss: 1.1472 - regression_loss: 1.0066 - classification_loss: 0.1406  325/500 [==================>...........] - ETA: 59s - loss: 1.1487 - regression_loss: 1.0077 - classification_loss: 0.1410 326/500 [==================>...........] - ETA: 59s - loss: 1.1487 - regression_loss: 1.0078 - classification_loss: 0.1409 327/500 [==================>...........] - ETA: 58s - loss: 1.1499 - regression_loss: 1.0090 - classification_loss: 0.1409 328/500 [==================>...........] - ETA: 58s - loss: 1.1504 - regression_loss: 1.0096 - classification_loss: 0.1408 329/500 [==================>...........] - ETA: 58s - loss: 1.1504 - regression_loss: 1.0097 - classification_loss: 0.1407 330/500 [==================>...........] - ETA: 57s - loss: 1.1511 - regression_loss: 1.0103 - classification_loss: 0.1408 331/500 [==================>...........] - ETA: 57s - loss: 1.1531 - regression_loss: 1.0117 - classification_loss: 0.1414 332/500 [==================>...........] - ETA: 57s - loss: 1.1531 - regression_loss: 1.0117 - classification_loss: 0.1414 333/500 [==================>...........] - ETA: 56s - loss: 1.1543 - regression_loss: 1.0128 - classification_loss: 0.1415 334/500 [===================>..........] - ETA: 56s - loss: 1.1544 - regression_loss: 1.0129 - classification_loss: 0.1415 335/500 [===================>..........] - ETA: 56s - loss: 1.1520 - regression_loss: 1.0108 - classification_loss: 0.1412 336/500 [===================>..........] - ETA: 55s - loss: 1.1525 - regression_loss: 1.0112 - classification_loss: 0.1413 337/500 [===================>..........] - ETA: 55s - loss: 1.1499 - regression_loss: 1.0090 - classification_loss: 0.1409 338/500 [===================>..........] - ETA: 55s - loss: 1.1488 - regression_loss: 1.0079 - classification_loss: 0.1409 339/500 [===================>..........] - ETA: 54s - loss: 1.1479 - regression_loss: 1.0072 - classification_loss: 0.1407 340/500 [===================>..........] 
- ETA: 54s - loss: 1.1461 - regression_loss: 1.0055 - classification_loss: 0.1406 341/500 [===================>..........] - ETA: 54s - loss: 1.1470 - regression_loss: 1.0063 - classification_loss: 0.1408 342/500 [===================>..........] - ETA: 53s - loss: 1.1490 - regression_loss: 1.0077 - classification_loss: 0.1413 343/500 [===================>..........] - ETA: 53s - loss: 1.1504 - regression_loss: 1.0090 - classification_loss: 0.1414 344/500 [===================>..........] - ETA: 53s - loss: 1.1494 - regression_loss: 1.0082 - classification_loss: 0.1412 345/500 [===================>..........] - ETA: 52s - loss: 1.1479 - regression_loss: 1.0070 - classification_loss: 0.1410 346/500 [===================>..........] - ETA: 52s - loss: 1.1474 - regression_loss: 1.0066 - classification_loss: 0.1408 347/500 [===================>..........] - ETA: 52s - loss: 1.1475 - regression_loss: 1.0065 - classification_loss: 0.1410 348/500 [===================>..........] - ETA: 51s - loss: 1.1484 - regression_loss: 1.0073 - classification_loss: 0.1411 349/500 [===================>..........] - ETA: 51s - loss: 1.1489 - regression_loss: 1.0077 - classification_loss: 0.1412 350/500 [====================>.........] - ETA: 51s - loss: 1.1490 - regression_loss: 1.0077 - classification_loss: 0.1413 351/500 [====================>.........] - ETA: 50s - loss: 1.1488 - regression_loss: 1.0075 - classification_loss: 0.1413 352/500 [====================>.........] - ETA: 50s - loss: 1.1482 - regression_loss: 1.0070 - classification_loss: 0.1412 353/500 [====================>.........] - ETA: 50s - loss: 1.1473 - regression_loss: 1.0064 - classification_loss: 0.1410 354/500 [====================>.........] - ETA: 49s - loss: 1.1471 - regression_loss: 1.0063 - classification_loss: 0.1408 355/500 [====================>.........] - ETA: 49s - loss: 1.1459 - regression_loss: 1.0053 - classification_loss: 0.1405 356/500 [====================>.........] 
- ETA: 49s - loss: 1.1463 - regression_loss: 1.0058 - classification_loss: 0.1405 357/500 [====================>.........] - ETA: 48s - loss: 1.1452 - regression_loss: 1.0049 - classification_loss: 0.1404 358/500 [====================>.........] - ETA: 48s - loss: 1.1471 - regression_loss: 1.0068 - classification_loss: 0.1403 359/500 [====================>.........] - ETA: 48s - loss: 1.1462 - regression_loss: 1.0060 - classification_loss: 0.1402 360/500 [====================>.........] - ETA: 47s - loss: 1.1465 - regression_loss: 1.0064 - classification_loss: 0.1401 361/500 [====================>.........] - ETA: 47s - loss: 1.1460 - regression_loss: 1.0060 - classification_loss: 0.1400 362/500 [====================>.........] - ETA: 47s - loss: 1.1459 - regression_loss: 1.0062 - classification_loss: 0.1397 363/500 [====================>.........] - ETA: 46s - loss: 1.1457 - regression_loss: 1.0060 - classification_loss: 0.1397 364/500 [====================>.........] - ETA: 46s - loss: 1.1462 - regression_loss: 1.0064 - classification_loss: 0.1398 365/500 [====================>.........] - ETA: 45s - loss: 1.1455 - regression_loss: 1.0058 - classification_loss: 0.1397 366/500 [====================>.........] - ETA: 45s - loss: 1.1472 - regression_loss: 1.0073 - classification_loss: 0.1399 367/500 [=====================>........] - ETA: 45s - loss: 1.1457 - regression_loss: 1.0061 - classification_loss: 0.1396 368/500 [=====================>........] - ETA: 44s - loss: 1.1451 - regression_loss: 1.0054 - classification_loss: 0.1398 369/500 [=====================>........] - ETA: 44s - loss: 1.1450 - regression_loss: 1.0054 - classification_loss: 0.1397 370/500 [=====================>........] - ETA: 44s - loss: 1.1443 - regression_loss: 1.0049 - classification_loss: 0.1395 371/500 [=====================>........] - ETA: 43s - loss: 1.1456 - regression_loss: 1.0059 - classification_loss: 0.1397 372/500 [=====================>........] 
- ETA: 43s - loss: 1.1449 - regression_loss: 1.0053 - classification_loss: 0.1396 373/500 [=====================>........] - ETA: 43s - loss: 1.1435 - regression_loss: 1.0042 - classification_loss: 0.1394 374/500 [=====================>........] - ETA: 42s - loss: 1.1422 - regression_loss: 1.0030 - classification_loss: 0.1392 375/500 [=====================>........] - ETA: 42s - loss: 1.1403 - regression_loss: 1.0014 - classification_loss: 0.1389 376/500 [=====================>........] - ETA: 42s - loss: 1.1385 - regression_loss: 0.9998 - classification_loss: 0.1387 377/500 [=====================>........] - ETA: 41s - loss: 1.1379 - regression_loss: 0.9993 - classification_loss: 0.1387 378/500 [=====================>........] - ETA: 41s - loss: 1.1370 - regression_loss: 0.9985 - classification_loss: 0.1385 379/500 [=====================>........] - ETA: 41s - loss: 1.1357 - regression_loss: 0.9974 - classification_loss: 0.1383 380/500 [=====================>........] - ETA: 40s - loss: 1.1341 - regression_loss: 0.9961 - classification_loss: 0.1380 381/500 [=====================>........] - ETA: 40s - loss: 1.1351 - regression_loss: 0.9967 - classification_loss: 0.1384 382/500 [=====================>........] - ETA: 40s - loss: 1.1349 - regression_loss: 0.9965 - classification_loss: 0.1383 383/500 [=====================>........] - ETA: 39s - loss: 1.1337 - regression_loss: 0.9956 - classification_loss: 0.1381 384/500 [======================>.......] - ETA: 39s - loss: 1.1348 - regression_loss: 0.9968 - classification_loss: 0.1381 385/500 [======================>.......] - ETA: 39s - loss: 1.1349 - regression_loss: 0.9970 - classification_loss: 0.1379 386/500 [======================>.......] - ETA: 38s - loss: 1.1344 - regression_loss: 0.9966 - classification_loss: 0.1378 387/500 [======================>.......] - ETA: 38s - loss: 1.1345 - regression_loss: 0.9967 - classification_loss: 0.1378 388/500 [======================>.......] 
- ETA: 38s - loss: 1.1359 - regression_loss: 0.9977 - classification_loss: 0.1382 389/500 [======================>.......] - ETA: 37s - loss: 1.1370 - regression_loss: 0.9987 - classification_loss: 0.1383 390/500 [======================>.......] - ETA: 37s - loss: 1.1371 - regression_loss: 0.9987 - classification_loss: 0.1384 391/500 [======================>.......] - ETA: 37s - loss: 1.1367 - regression_loss: 0.9985 - classification_loss: 0.1382 392/500 [======================>.......] - ETA: 36s - loss: 1.1369 - regression_loss: 0.9988 - classification_loss: 0.1382 393/500 [======================>.......] - ETA: 36s - loss: 1.1381 - regression_loss: 0.9997 - classification_loss: 0.1384 394/500 [======================>.......] - ETA: 36s - loss: 1.1381 - regression_loss: 0.9997 - classification_loss: 0.1384 395/500 [======================>.......] - ETA: 35s - loss: 1.1380 - regression_loss: 0.9995 - classification_loss: 0.1385 396/500 [======================>.......] - ETA: 35s - loss: 1.1382 - regression_loss: 0.9996 - classification_loss: 0.1386 397/500 [======================>.......] - ETA: 35s - loss: 1.1387 - regression_loss: 1.0001 - classification_loss: 0.1386 398/500 [======================>.......] - ETA: 34s - loss: 1.1368 - regression_loss: 0.9985 - classification_loss: 0.1383 399/500 [======================>.......] - ETA: 34s - loss: 1.1383 - regression_loss: 0.9997 - classification_loss: 0.1386 400/500 [=======================>......] - ETA: 34s - loss: 1.1391 - regression_loss: 1.0004 - classification_loss: 0.1386 401/500 [=======================>......] - ETA: 33s - loss: 1.1380 - regression_loss: 0.9996 - classification_loss: 0.1384 402/500 [=======================>......] - ETA: 33s - loss: 1.1373 - regression_loss: 0.9990 - classification_loss: 0.1383 403/500 [=======================>......] - ETA: 33s - loss: 1.1369 - regression_loss: 0.9987 - classification_loss: 0.1382 404/500 [=======================>......] 
- ETA: 32s - loss: 1.1368 - regression_loss: 0.9987 - classification_loss: 0.1382 405/500 [=======================>......] - ETA: 32s - loss: 1.1359 - regression_loss: 0.9978 - classification_loss: 0.1381 406/500 [=======================>......] - ETA: 32s - loss: 1.1359 - regression_loss: 0.9979 - classification_loss: 0.1380 407/500 [=======================>......] - ETA: 31s - loss: 1.1353 - regression_loss: 0.9974 - classification_loss: 0.1379 408/500 [=======================>......] - ETA: 31s - loss: 1.1341 - regression_loss: 0.9964 - classification_loss: 0.1376 409/500 [=======================>......] - ETA: 31s - loss: 1.1344 - regression_loss: 0.9967 - classification_loss: 0.1377 410/500 [=======================>......] - ETA: 30s - loss: 1.1343 - regression_loss: 0.9965 - classification_loss: 0.1378 411/500 [=======================>......] - ETA: 30s - loss: 1.1350 - regression_loss: 0.9970 - classification_loss: 0.1379 412/500 [=======================>......] - ETA: 30s - loss: 1.1352 - regression_loss: 0.9973 - classification_loss: 0.1379 413/500 [=======================>......] - ETA: 29s - loss: 1.1360 - regression_loss: 0.9979 - classification_loss: 0.1381 414/500 [=======================>......] - ETA: 29s - loss: 1.1355 - regression_loss: 0.9975 - classification_loss: 0.1380 415/500 [=======================>......] - ETA: 28s - loss: 1.1349 - regression_loss: 0.9970 - classification_loss: 0.1379 416/500 [=======================>......] - ETA: 28s - loss: 1.1343 - regression_loss: 0.9965 - classification_loss: 0.1378 417/500 [========================>.....] - ETA: 28s - loss: 1.1344 - regression_loss: 0.9966 - classification_loss: 0.1378 418/500 [========================>.....] - ETA: 27s - loss: 1.1344 - regression_loss: 0.9967 - classification_loss: 0.1377 419/500 [========================>.....] - ETA: 27s - loss: 1.1343 - regression_loss: 0.9967 - classification_loss: 0.1376 420/500 [========================>.....] 
- ETA: 27s - loss: 1.1352 - regression_loss: 0.9976 - classification_loss: 0.1376 421/500 [========================>.....] - ETA: 26s - loss: 1.1351 - regression_loss: 0.9975 - classification_loss: 0.1376 422/500 [========================>.....] - ETA: 26s - loss: 1.1349 - regression_loss: 0.9974 - classification_loss: 0.1375 423/500 [========================>.....] - ETA: 26s - loss: 1.1339 - regression_loss: 0.9965 - classification_loss: 0.1373 424/500 [========================>.....] - ETA: 25s - loss: 1.1329 - regression_loss: 0.9958 - classification_loss: 0.1371 425/500 [========================>.....] - ETA: 25s - loss: 1.1359 - regression_loss: 0.9980 - classification_loss: 0.1379 426/500 [========================>.....] - ETA: 25s - loss: 1.1360 - regression_loss: 0.9981 - classification_loss: 0.1379 427/500 [========================>.....] - ETA: 24s - loss: 1.1364 - regression_loss: 0.9984 - classification_loss: 0.1379 428/500 [========================>.....] - ETA: 24s - loss: 1.1361 - regression_loss: 0.9984 - classification_loss: 0.1378 429/500 [========================>.....] - ETA: 24s - loss: 1.1367 - regression_loss: 0.9990 - classification_loss: 0.1378 430/500 [========================>.....] - ETA: 23s - loss: 1.1388 - regression_loss: 1.0007 - classification_loss: 0.1380 431/500 [========================>.....] - ETA: 23s - loss: 1.1386 - regression_loss: 1.0006 - classification_loss: 0.1380 432/500 [========================>.....] - ETA: 23s - loss: 1.1391 - regression_loss: 1.0010 - classification_loss: 0.1381 433/500 [========================>.....] - ETA: 22s - loss: 1.1382 - regression_loss: 1.0003 - classification_loss: 0.1379 434/500 [=========================>....] - ETA: 22s - loss: 1.1372 - regression_loss: 0.9994 - classification_loss: 0.1378 435/500 [=========================>....] - ETA: 22s - loss: 1.1368 - regression_loss: 0.9992 - classification_loss: 0.1376 436/500 [=========================>....] 
500/500 [==============================] - 171s 341ms/step - loss: 1.1406 - regression_loss: 1.0032 - classification_loss: 0.1374
326 instances of class plum with average precision: 0.8231
mAP: 0.8231
Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5
Epoch 20/150
269/500 [===============>..............] - ETA: 1:18 - loss: 1.0862 - regression_loss: 0.9537 - classification_loss: 0.1326
270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.0856 - regression_loss: 0.9532 - classification_loss: 0.1323 271/500 [===============>..............] - ETA: 1:17 - loss: 1.0874 - regression_loss: 0.9546 - classification_loss: 0.1328 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0882 - regression_loss: 0.9555 - classification_loss: 0.1327 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0897 - regression_loss: 0.9570 - classification_loss: 0.1327 274/500 [===============>..............] - ETA: 1:16 - loss: 1.0881 - regression_loss: 0.9556 - classification_loss: 0.1325 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0862 - regression_loss: 0.9542 - classification_loss: 0.1320 276/500 [===============>..............] - ETA: 1:16 - loss: 1.0874 - regression_loss: 0.9551 - classification_loss: 0.1323 277/500 [===============>..............] - ETA: 1:15 - loss: 1.0872 - regression_loss: 0.9550 - classification_loss: 0.1322 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0867 - regression_loss: 0.9548 - classification_loss: 0.1319 279/500 [===============>..............] - ETA: 1:14 - loss: 1.0881 - regression_loss: 0.9559 - classification_loss: 0.1322 280/500 [===============>..............] - ETA: 1:14 - loss: 1.0881 - regression_loss: 0.9560 - classification_loss: 0.1321 281/500 [===============>..............] - ETA: 1:14 - loss: 1.0881 - regression_loss: 0.9561 - classification_loss: 0.1320 282/500 [===============>..............] - ETA: 1:13 - loss: 1.0889 - regression_loss: 0.9567 - classification_loss: 0.1322 283/500 [===============>..............] - ETA: 1:13 - loss: 1.0876 - regression_loss: 0.9557 - classification_loss: 0.1319 284/500 [================>.............] - ETA: 1:13 - loss: 1.0871 - regression_loss: 0.9554 - classification_loss: 0.1317 285/500 [================>.............] - ETA: 1:12 - loss: 1.0882 - regression_loss: 0.9564 - classification_loss: 0.1318 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.0888 - regression_loss: 0.9567 - classification_loss: 0.1321 287/500 [================>.............] - ETA: 1:12 - loss: 1.0904 - regression_loss: 0.9582 - classification_loss: 0.1322 288/500 [================>.............] - ETA: 1:11 - loss: 1.0915 - regression_loss: 0.9592 - classification_loss: 0.1323 289/500 [================>.............] - ETA: 1:11 - loss: 1.0926 - regression_loss: 0.9602 - classification_loss: 0.1323 290/500 [================>.............] - ETA: 1:11 - loss: 1.0922 - regression_loss: 0.9597 - classification_loss: 0.1324 291/500 [================>.............] - ETA: 1:10 - loss: 1.0903 - regression_loss: 0.9581 - classification_loss: 0.1321 292/500 [================>.............] - ETA: 1:10 - loss: 1.0908 - regression_loss: 0.9588 - classification_loss: 0.1320 293/500 [================>.............] - ETA: 1:10 - loss: 1.0907 - regression_loss: 0.9589 - classification_loss: 0.1319 294/500 [================>.............] - ETA: 1:09 - loss: 1.0907 - regression_loss: 0.9589 - classification_loss: 0.1318 295/500 [================>.............] - ETA: 1:09 - loss: 1.0964 - regression_loss: 0.9633 - classification_loss: 0.1330 296/500 [================>.............] - ETA: 1:09 - loss: 1.0936 - regression_loss: 0.9609 - classification_loss: 0.1327 297/500 [================>.............] - ETA: 1:08 - loss: 1.0965 - regression_loss: 0.9637 - classification_loss: 0.1328 298/500 [================>.............] - ETA: 1:08 - loss: 1.0982 - regression_loss: 0.9649 - classification_loss: 0.1333 299/500 [================>.............] - ETA: 1:08 - loss: 1.1007 - regression_loss: 0.9666 - classification_loss: 0.1340 300/500 [=================>............] - ETA: 1:07 - loss: 1.0992 - regression_loss: 0.9655 - classification_loss: 0.1338 301/500 [=================>............] - ETA: 1:07 - loss: 1.0976 - regression_loss: 0.9641 - classification_loss: 0.1334 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.0981 - regression_loss: 0.9646 - classification_loss: 0.1335 303/500 [=================>............] - ETA: 1:06 - loss: 1.0986 - regression_loss: 0.9649 - classification_loss: 0.1337 304/500 [=================>............] - ETA: 1:06 - loss: 1.1001 - regression_loss: 0.9659 - classification_loss: 0.1342 305/500 [=================>............] - ETA: 1:06 - loss: 1.1024 - regression_loss: 0.9682 - classification_loss: 0.1343 306/500 [=================>............] - ETA: 1:05 - loss: 1.1005 - regression_loss: 0.9665 - classification_loss: 0.1340 307/500 [=================>............] - ETA: 1:05 - loss: 1.0989 - regression_loss: 0.9651 - classification_loss: 0.1338 308/500 [=================>............] - ETA: 1:05 - loss: 1.0988 - regression_loss: 0.9651 - classification_loss: 0.1337 309/500 [=================>............] - ETA: 1:04 - loss: 1.0992 - regression_loss: 0.9654 - classification_loss: 0.1337 310/500 [=================>............] - ETA: 1:04 - loss: 1.1010 - regression_loss: 0.9664 - classification_loss: 0.1345 311/500 [=================>............] - ETA: 1:04 - loss: 1.1011 - regression_loss: 0.9666 - classification_loss: 0.1344 312/500 [=================>............] - ETA: 1:03 - loss: 1.1043 - regression_loss: 0.9693 - classification_loss: 0.1350 313/500 [=================>............] - ETA: 1:03 - loss: 1.1037 - regression_loss: 0.9689 - classification_loss: 0.1348 314/500 [=================>............] - ETA: 1:03 - loss: 1.1038 - regression_loss: 0.9690 - classification_loss: 0.1347 315/500 [=================>............] - ETA: 1:02 - loss: 1.1052 - regression_loss: 0.9701 - classification_loss: 0.1351 316/500 [=================>............] - ETA: 1:02 - loss: 1.1038 - regression_loss: 0.9689 - classification_loss: 0.1349 317/500 [==================>...........] - ETA: 1:02 - loss: 1.1040 - regression_loss: 0.9689 - classification_loss: 0.1351 318/500 [==================>...........] 
- ETA: 1:01 - loss: 1.1053 - regression_loss: 0.9700 - classification_loss: 0.1353 319/500 [==================>...........] - ETA: 1:01 - loss: 1.1048 - regression_loss: 0.9696 - classification_loss: 0.1352 320/500 [==================>...........] - ETA: 1:01 - loss: 1.1034 - regression_loss: 0.9685 - classification_loss: 0.1350 321/500 [==================>...........] - ETA: 1:00 - loss: 1.1061 - regression_loss: 0.9710 - classification_loss: 0.1351 322/500 [==================>...........] - ETA: 1:00 - loss: 1.1062 - regression_loss: 0.9711 - classification_loss: 0.1351 323/500 [==================>...........] - ETA: 1:00 - loss: 1.1058 - regression_loss: 0.9707 - classification_loss: 0.1351 324/500 [==================>...........] - ETA: 59s - loss: 1.1054 - regression_loss: 0.9705 - classification_loss: 0.1349  325/500 [==================>...........] - ETA: 59s - loss: 1.1058 - regression_loss: 0.9710 - classification_loss: 0.1348 326/500 [==================>...........] - ETA: 59s - loss: 1.1055 - regression_loss: 0.9707 - classification_loss: 0.1347 327/500 [==================>...........] - ETA: 58s - loss: 1.1057 - regression_loss: 0.9709 - classification_loss: 0.1348 328/500 [==================>...........] - ETA: 58s - loss: 1.1040 - regression_loss: 0.9695 - classification_loss: 0.1345 329/500 [==================>...........] - ETA: 58s - loss: 1.1041 - regression_loss: 0.9696 - classification_loss: 0.1345 330/500 [==================>...........] - ETA: 57s - loss: 1.1025 - regression_loss: 0.9682 - classification_loss: 0.1342 331/500 [==================>...........] - ETA: 57s - loss: 1.0999 - regression_loss: 0.9659 - classification_loss: 0.1339 332/500 [==================>...........] - ETA: 56s - loss: 1.1005 - regression_loss: 0.9666 - classification_loss: 0.1339 333/500 [==================>...........] - ETA: 56s - loss: 1.0990 - regression_loss: 0.9653 - classification_loss: 0.1336 334/500 [===================>..........] 
- ETA: 56s - loss: 1.0983 - regression_loss: 0.9648 - classification_loss: 0.1335 335/500 [===================>..........] - ETA: 55s - loss: 1.0980 - regression_loss: 0.9646 - classification_loss: 0.1334 336/500 [===================>..........] - ETA: 55s - loss: 1.0978 - regression_loss: 0.9646 - classification_loss: 0.1332 337/500 [===================>..........] - ETA: 55s - loss: 1.0959 - regression_loss: 0.9629 - classification_loss: 0.1330 338/500 [===================>..........] - ETA: 54s - loss: 1.0960 - regression_loss: 0.9632 - classification_loss: 0.1329 339/500 [===================>..........] - ETA: 54s - loss: 1.0965 - regression_loss: 0.9635 - classification_loss: 0.1329 340/500 [===================>..........] - ETA: 54s - loss: 1.0948 - regression_loss: 0.9622 - classification_loss: 0.1326 341/500 [===================>..........] - ETA: 53s - loss: 1.0949 - regression_loss: 0.9623 - classification_loss: 0.1326 342/500 [===================>..........] - ETA: 53s - loss: 1.0952 - regression_loss: 0.9626 - classification_loss: 0.1327 343/500 [===================>..........] - ETA: 53s - loss: 1.0965 - regression_loss: 0.9636 - classification_loss: 0.1329 344/500 [===================>..........] - ETA: 52s - loss: 1.0965 - regression_loss: 0.9637 - classification_loss: 0.1328 345/500 [===================>..........] - ETA: 52s - loss: 1.0969 - regression_loss: 0.9639 - classification_loss: 0.1330 346/500 [===================>..........] - ETA: 52s - loss: 1.0953 - regression_loss: 0.9625 - classification_loss: 0.1328 347/500 [===================>..........] - ETA: 51s - loss: 1.0956 - regression_loss: 0.9628 - classification_loss: 0.1328 348/500 [===================>..........] - ETA: 51s - loss: 1.0950 - regression_loss: 0.9624 - classification_loss: 0.1326 349/500 [===================>..........] - ETA: 51s - loss: 1.0955 - regression_loss: 0.9629 - classification_loss: 0.1327 350/500 [====================>.........] 
- ETA: 50s - loss: 1.0956 - regression_loss: 0.9629 - classification_loss: 0.1327 351/500 [====================>.........] - ETA: 50s - loss: 1.0943 - regression_loss: 0.9618 - classification_loss: 0.1325 352/500 [====================>.........] - ETA: 50s - loss: 1.0934 - regression_loss: 0.9612 - classification_loss: 0.1323 353/500 [====================>.........] - ETA: 49s - loss: 1.0964 - regression_loss: 0.9638 - classification_loss: 0.1325 354/500 [====================>.........] - ETA: 49s - loss: 1.0960 - regression_loss: 0.9636 - classification_loss: 0.1324 355/500 [====================>.........] - ETA: 49s - loss: 1.0972 - regression_loss: 0.9649 - classification_loss: 0.1323 356/500 [====================>.........] - ETA: 48s - loss: 1.0964 - regression_loss: 0.9642 - classification_loss: 0.1322 357/500 [====================>.........] - ETA: 48s - loss: 1.0969 - regression_loss: 0.9647 - classification_loss: 0.1322 358/500 [====================>.........] - ETA: 48s - loss: 1.0965 - regression_loss: 0.9644 - classification_loss: 0.1321 359/500 [====================>.........] - ETA: 47s - loss: 1.0953 - regression_loss: 0.9635 - classification_loss: 0.1319 360/500 [====================>.........] - ETA: 47s - loss: 1.0944 - regression_loss: 0.9626 - classification_loss: 0.1318 361/500 [====================>.........] - ETA: 47s - loss: 1.0950 - regression_loss: 0.9633 - classification_loss: 0.1317 362/500 [====================>.........] - ETA: 46s - loss: 1.0946 - regression_loss: 0.9631 - classification_loss: 0.1315 363/500 [====================>.........] - ETA: 46s - loss: 1.0941 - regression_loss: 0.9627 - classification_loss: 0.1314 364/500 [====================>.........] - ETA: 46s - loss: 1.0944 - regression_loss: 0.9631 - classification_loss: 0.1313 365/500 [====================>.........] - ETA: 45s - loss: 1.0943 - regression_loss: 0.9629 - classification_loss: 0.1314 366/500 [====================>.........] 
- ETA: 45s - loss: 1.0943 - regression_loss: 0.9631 - classification_loss: 0.1312 367/500 [=====================>........] - ETA: 45s - loss: 1.0935 - regression_loss: 0.9624 - classification_loss: 0.1311 368/500 [=====================>........] - ETA: 44s - loss: 1.0927 - regression_loss: 0.9618 - classification_loss: 0.1310 369/500 [=====================>........] - ETA: 44s - loss: 1.0926 - regression_loss: 0.9618 - classification_loss: 0.1308 370/500 [=====================>........] - ETA: 44s - loss: 1.0935 - regression_loss: 0.9627 - classification_loss: 0.1308 371/500 [=====================>........] - ETA: 43s - loss: 1.0926 - regression_loss: 0.9620 - classification_loss: 0.1306 372/500 [=====================>........] - ETA: 43s - loss: 1.0914 - regression_loss: 0.9609 - classification_loss: 0.1304 373/500 [=====================>........] - ETA: 43s - loss: 1.0912 - regression_loss: 0.9608 - classification_loss: 0.1303 374/500 [=====================>........] - ETA: 42s - loss: 1.0907 - regression_loss: 0.9605 - classification_loss: 0.1302 375/500 [=====================>........] - ETA: 42s - loss: 1.0923 - regression_loss: 0.9618 - classification_loss: 0.1305 376/500 [=====================>........] - ETA: 42s - loss: 1.0918 - regression_loss: 0.9612 - classification_loss: 0.1306 377/500 [=====================>........] - ETA: 41s - loss: 1.0931 - regression_loss: 0.9622 - classification_loss: 0.1310 378/500 [=====================>........] - ETA: 41s - loss: 1.0928 - regression_loss: 0.9620 - classification_loss: 0.1308 379/500 [=====================>........] - ETA: 40s - loss: 1.0928 - regression_loss: 0.9619 - classification_loss: 0.1308 380/500 [=====================>........] - ETA: 40s - loss: 1.0939 - regression_loss: 0.9627 - classification_loss: 0.1312 381/500 [=====================>........] - ETA: 40s - loss: 1.0932 - regression_loss: 0.9622 - classification_loss: 0.1311 382/500 [=====================>........] 
- ETA: 39s - loss: 1.0934 - regression_loss: 0.9624 - classification_loss: 0.1310 383/500 [=====================>........] - ETA: 39s - loss: 1.0920 - regression_loss: 0.9611 - classification_loss: 0.1309 384/500 [======================>.......] - ETA: 39s - loss: 1.0902 - regression_loss: 0.9596 - classification_loss: 0.1306 385/500 [======================>.......] - ETA: 38s - loss: 1.0900 - regression_loss: 0.9594 - classification_loss: 0.1306 386/500 [======================>.......] - ETA: 38s - loss: 1.0887 - regression_loss: 0.9583 - classification_loss: 0.1304 387/500 [======================>.......] - ETA: 38s - loss: 1.0892 - regression_loss: 0.9589 - classification_loss: 0.1303 388/500 [======================>.......] - ETA: 37s - loss: 1.0905 - regression_loss: 0.9599 - classification_loss: 0.1306 389/500 [======================>.......] - ETA: 37s - loss: 1.0901 - regression_loss: 0.9596 - classification_loss: 0.1306 390/500 [======================>.......] - ETA: 37s - loss: 1.0922 - regression_loss: 0.9613 - classification_loss: 0.1309 391/500 [======================>.......] - ETA: 36s - loss: 1.0929 - regression_loss: 0.9618 - classification_loss: 0.1311 392/500 [======================>.......] - ETA: 36s - loss: 1.0925 - regression_loss: 0.9616 - classification_loss: 0.1309 393/500 [======================>.......] - ETA: 36s - loss: 1.0936 - regression_loss: 0.9622 - classification_loss: 0.1315 394/500 [======================>.......] - ETA: 35s - loss: 1.0927 - regression_loss: 0.9613 - classification_loss: 0.1315 395/500 [======================>.......] - ETA: 35s - loss: 1.0934 - regression_loss: 0.9616 - classification_loss: 0.1318 396/500 [======================>.......] - ETA: 35s - loss: 1.0929 - regression_loss: 0.9613 - classification_loss: 0.1316 397/500 [======================>.......] - ETA: 34s - loss: 1.0932 - regression_loss: 0.9615 - classification_loss: 0.1317 398/500 [======================>.......] 
- ETA: 34s - loss: 1.0935 - regression_loss: 0.9618 - classification_loss: 0.1316 399/500 [======================>.......] - ETA: 34s - loss: 1.0939 - regression_loss: 0.9622 - classification_loss: 0.1317 400/500 [=======================>......] - ETA: 33s - loss: 1.0921 - regression_loss: 0.9606 - classification_loss: 0.1315 401/500 [=======================>......] - ETA: 33s - loss: 1.0920 - regression_loss: 0.9605 - classification_loss: 0.1315 402/500 [=======================>......] - ETA: 33s - loss: 1.0924 - regression_loss: 0.9609 - classification_loss: 0.1315 403/500 [=======================>......] - ETA: 32s - loss: 1.0924 - regression_loss: 0.9609 - classification_loss: 0.1315 404/500 [=======================>......] - ETA: 32s - loss: 1.0926 - regression_loss: 0.9610 - classification_loss: 0.1316 405/500 [=======================>......] - ETA: 32s - loss: 1.0937 - regression_loss: 0.9618 - classification_loss: 0.1319 406/500 [=======================>......] - ETA: 31s - loss: 1.0927 - regression_loss: 0.9608 - classification_loss: 0.1319 407/500 [=======================>......] - ETA: 31s - loss: 1.0937 - regression_loss: 0.9618 - classification_loss: 0.1320 408/500 [=======================>......] - ETA: 31s - loss: 1.0940 - regression_loss: 0.9619 - classification_loss: 0.1321 409/500 [=======================>......] - ETA: 30s - loss: 1.0940 - regression_loss: 0.9620 - classification_loss: 0.1320 410/500 [=======================>......] - ETA: 30s - loss: 1.0928 - regression_loss: 0.9609 - classification_loss: 0.1319 411/500 [=======================>......] - ETA: 30s - loss: 1.0924 - regression_loss: 0.9606 - classification_loss: 0.1318 412/500 [=======================>......] - ETA: 29s - loss: 1.0913 - regression_loss: 0.9593 - classification_loss: 0.1320 413/500 [=======================>......] - ETA: 29s - loss: 1.0922 - regression_loss: 0.9598 - classification_loss: 0.1324 414/500 [=======================>......] 
- ETA: 29s - loss: 1.0928 - regression_loss: 0.9603 - classification_loss: 0.1325 415/500 [=======================>......] - ETA: 28s - loss: 1.0969 - regression_loss: 0.9633 - classification_loss: 0.1336 416/500 [=======================>......] - ETA: 28s - loss: 1.0986 - regression_loss: 0.9645 - classification_loss: 0.1341 417/500 [========================>.....] - ETA: 28s - loss: 1.0986 - regression_loss: 0.9646 - classification_loss: 0.1340 418/500 [========================>.....] - ETA: 27s - loss: 1.1004 - regression_loss: 0.9659 - classification_loss: 0.1345 419/500 [========================>.....] - ETA: 27s - loss: 1.1002 - regression_loss: 0.9658 - classification_loss: 0.1344 420/500 [========================>.....] - ETA: 27s - loss: 1.1011 - regression_loss: 0.9665 - classification_loss: 0.1346 421/500 [========================>.....] - ETA: 26s - loss: 1.1017 - regression_loss: 0.9669 - classification_loss: 0.1348 422/500 [========================>.....] - ETA: 26s - loss: 1.1015 - regression_loss: 0.9668 - classification_loss: 0.1347 423/500 [========================>.....] - ETA: 26s - loss: 1.1000 - regression_loss: 0.9655 - classification_loss: 0.1345 424/500 [========================>.....] - ETA: 25s - loss: 1.1009 - regression_loss: 0.9662 - classification_loss: 0.1346 425/500 [========================>.....] - ETA: 25s - loss: 1.1015 - regression_loss: 0.9668 - classification_loss: 0.1347 426/500 [========================>.....] - ETA: 25s - loss: 1.1006 - regression_loss: 0.9662 - classification_loss: 0.1345 427/500 [========================>.....] - ETA: 24s - loss: 1.0994 - regression_loss: 0.9652 - classification_loss: 0.1342 428/500 [========================>.....] - ETA: 24s - loss: 1.0994 - regression_loss: 0.9654 - classification_loss: 0.1340 429/500 [========================>.....] - ETA: 24s - loss: 1.0977 - regression_loss: 0.9634 - classification_loss: 0.1344 430/500 [========================>.....] 
- ETA: 23s - loss: 1.0976 - regression_loss: 0.9633 - classification_loss: 0.1343 431/500 [========================>.....] - ETA: 23s - loss: 1.0977 - regression_loss: 0.9635 - classification_loss: 0.1343 432/500 [========================>.....] - ETA: 23s - loss: 1.0975 - regression_loss: 0.9634 - classification_loss: 0.1341 433/500 [========================>.....] - ETA: 22s - loss: 1.0961 - regression_loss: 0.9622 - classification_loss: 0.1339 434/500 [=========================>....] - ETA: 22s - loss: 1.0960 - regression_loss: 0.9622 - classification_loss: 0.1338 435/500 [=========================>....] - ETA: 22s - loss: 1.0962 - regression_loss: 0.9623 - classification_loss: 0.1339 436/500 [=========================>....] - ETA: 21s - loss: 1.0955 - regression_loss: 0.9615 - classification_loss: 0.1340 437/500 [=========================>....] - ETA: 21s - loss: 1.0943 - regression_loss: 0.9604 - classification_loss: 0.1339 438/500 [=========================>....] - ETA: 21s - loss: 1.0944 - regression_loss: 0.9606 - classification_loss: 0.1338 439/500 [=========================>....] - ETA: 20s - loss: 1.0944 - regression_loss: 0.9606 - classification_loss: 0.1339 440/500 [=========================>....] - ETA: 20s - loss: 1.0976 - regression_loss: 0.9632 - classification_loss: 0.1344 441/500 [=========================>....] - ETA: 20s - loss: 1.0990 - regression_loss: 0.9642 - classification_loss: 0.1348 442/500 [=========================>....] - ETA: 19s - loss: 1.0998 - regression_loss: 0.9650 - classification_loss: 0.1348 443/500 [=========================>....] - ETA: 19s - loss: 1.1001 - regression_loss: 0.9654 - classification_loss: 0.1347 444/500 [=========================>....] - ETA: 18s - loss: 1.1004 - regression_loss: 0.9659 - classification_loss: 0.1346 445/500 [=========================>....] - ETA: 18s - loss: 1.1025 - regression_loss: 0.9674 - classification_loss: 0.1352 446/500 [=========================>....] 
- ETA: 18s - loss: 1.1026 - regression_loss: 0.9674 - classification_loss: 0.1352 447/500 [=========================>....] - ETA: 17s - loss: 1.1021 - regression_loss: 0.9669 - classification_loss: 0.1352 448/500 [=========================>....] - ETA: 17s - loss: 1.1016 - regression_loss: 0.9666 - classification_loss: 0.1350 449/500 [=========================>....] - ETA: 17s - loss: 1.1013 - regression_loss: 0.9662 - classification_loss: 0.1351 450/500 [==========================>...] - ETA: 16s - loss: 1.1011 - regression_loss: 0.9661 - classification_loss: 0.1350 451/500 [==========================>...] - ETA: 16s - loss: 1.1009 - regression_loss: 0.9659 - classification_loss: 0.1349 452/500 [==========================>...] - ETA: 16s - loss: 1.1040 - regression_loss: 0.9681 - classification_loss: 0.1359 453/500 [==========================>...] - ETA: 15s - loss: 1.1033 - regression_loss: 0.9676 - classification_loss: 0.1357 454/500 [==========================>...] - ETA: 15s - loss: 1.1029 - regression_loss: 0.9672 - classification_loss: 0.1357 455/500 [==========================>...] - ETA: 15s - loss: 1.1029 - regression_loss: 0.9672 - classification_loss: 0.1357 456/500 [==========================>...] - ETA: 14s - loss: 1.1022 - regression_loss: 0.9667 - classification_loss: 0.1356 457/500 [==========================>...] - ETA: 14s - loss: 1.1007 - regression_loss: 0.9653 - classification_loss: 0.1354 458/500 [==========================>...] - ETA: 14s - loss: 1.1003 - regression_loss: 0.9650 - classification_loss: 0.1354 459/500 [==========================>...] - ETA: 13s - loss: 1.1002 - regression_loss: 0.9647 - classification_loss: 0.1354 460/500 [==========================>...] - ETA: 13s - loss: 1.1003 - regression_loss: 0.9649 - classification_loss: 0.1355 461/500 [==========================>...] - ETA: 13s - loss: 1.1015 - regression_loss: 0.9659 - classification_loss: 0.1356 462/500 [==========================>...] 
- ETA: 12s - loss: 1.1018 - regression_loss: 0.9663 - classification_loss: 0.1355 463/500 [==========================>...] - ETA: 12s - loss: 1.1025 - regression_loss: 0.9669 - classification_loss: 0.1356 464/500 [==========================>...] - ETA: 12s - loss: 1.1021 - regression_loss: 0.9665 - classification_loss: 0.1356 465/500 [==========================>...] - ETA: 11s - loss: 1.1025 - regression_loss: 0.9666 - classification_loss: 0.1358 466/500 [==========================>...] - ETA: 11s - loss: 1.1032 - regression_loss: 0.9674 - classification_loss: 0.1357 467/500 [===========================>..] - ETA: 11s - loss: 1.1040 - regression_loss: 0.9683 - classification_loss: 0.1358 468/500 [===========================>..] - ETA: 10s - loss: 1.1036 - regression_loss: 0.9681 - classification_loss: 0.1356 469/500 [===========================>..] - ETA: 10s - loss: 1.1048 - regression_loss: 0.9690 - classification_loss: 0.1359 470/500 [===========================>..] - ETA: 10s - loss: 1.1044 - regression_loss: 0.9686 - classification_loss: 0.1358 471/500 [===========================>..] - ETA: 9s - loss: 1.1051 - regression_loss: 0.9692 - classification_loss: 0.1358  472/500 [===========================>..] - ETA: 9s - loss: 1.1048 - regression_loss: 0.9690 - classification_loss: 0.1358 473/500 [===========================>..] - ETA: 9s - loss: 1.1040 - regression_loss: 0.9683 - classification_loss: 0.1357 474/500 [===========================>..] - ETA: 8s - loss: 1.1033 - regression_loss: 0.9677 - classification_loss: 0.1356 475/500 [===========================>..] - ETA: 8s - loss: 1.1031 - regression_loss: 0.9675 - classification_loss: 0.1356 476/500 [===========================>..] - ETA: 8s - loss: 1.1013 - regression_loss: 0.9659 - classification_loss: 0.1354 477/500 [===========================>..] - ETA: 7s - loss: 1.1008 - regression_loss: 0.9656 - classification_loss: 0.1352 478/500 [===========================>..] 
- ETA: 7s - loss: 1.1013 - regression_loss: 0.9663 - classification_loss: 0.1351 479/500 [===========================>..] - ETA: 7s - loss: 1.1014 - regression_loss: 0.9664 - classification_loss: 0.1349 480/500 [===========================>..] - ETA: 6s - loss: 1.1002 - regression_loss: 0.9654 - classification_loss: 0.1348 481/500 [===========================>..] - ETA: 6s - loss: 1.0999 - regression_loss: 0.9653 - classification_loss: 0.1345 482/500 [===========================>..] - ETA: 6s - loss: 1.0987 - regression_loss: 0.9644 - classification_loss: 0.1344 483/500 [===========================>..] - ETA: 5s - loss: 1.0995 - regression_loss: 0.9650 - classification_loss: 0.1345 484/500 [============================>.] - ETA: 5s - loss: 1.1004 - regression_loss: 0.9658 - classification_loss: 0.1346 485/500 [============================>.] - ETA: 5s - loss: 1.0992 - regression_loss: 0.9649 - classification_loss: 0.1343 486/500 [============================>.] - ETA: 4s - loss: 1.0987 - regression_loss: 0.9645 - classification_loss: 0.1342 487/500 [============================>.] - ETA: 4s - loss: 1.0979 - regression_loss: 0.9638 - classification_loss: 0.1340 488/500 [============================>.] - ETA: 4s - loss: 1.0978 - regression_loss: 0.9638 - classification_loss: 0.1340 489/500 [============================>.] - ETA: 3s - loss: 1.0960 - regression_loss: 0.9618 - classification_loss: 0.1342 490/500 [============================>.] - ETA: 3s - loss: 1.0962 - regression_loss: 0.9620 - classification_loss: 0.1342 491/500 [============================>.] - ETA: 3s - loss: 1.0959 - regression_loss: 0.9618 - classification_loss: 0.1341 492/500 [============================>.] - ETA: 2s - loss: 1.0968 - regression_loss: 0.9627 - classification_loss: 0.1341 493/500 [============================>.] - ETA: 2s - loss: 1.0967 - regression_loss: 0.9627 - classification_loss: 0.1341 494/500 [============================>.] 
- ETA: 2s - loss: 1.0958 - regression_loss: 0.9619 - classification_loss: 0.1339 495/500 [============================>.] - ETA: 1s - loss: 1.0962 - regression_loss: 0.9624 - classification_loss: 0.1338 496/500 [============================>.] - ETA: 1s - loss: 1.0952 - regression_loss: 0.9616 - classification_loss: 0.1336 497/500 [============================>.] - ETA: 1s - loss: 1.0980 - regression_loss: 0.9639 - classification_loss: 0.1341 498/500 [============================>.] - ETA: 0s - loss: 1.0964 - regression_loss: 0.9626 - classification_loss: 0.1339 499/500 [============================>.] - ETA: 0s - loss: 1.0960 - regression_loss: 0.9622 - classification_loss: 0.1338 500/500 [==============================] - 170s 339ms/step - loss: 1.0951 - regression_loss: 0.9613 - classification_loss: 0.1337
326 instances of class plum with average precision: 0.8382
mAP: 0.8382
Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5
Epoch 21/150
1/500 [..............................] - ETA: 2:50 - loss: 2.2006 - regression_loss: 1.7556 - classification_loss: 0.4450 2/500 [..............................] - ETA: 2:52 - loss: 1.8840 - regression_loss: 1.5767 - classification_loss: 0.3073 3/500 [..............................] - ETA: 2:49 - loss: 1.7291 - regression_loss: 1.4800 - classification_loss: 0.2491 4/500 [..............................] - ETA: 2:48 - loss: 1.6070 - regression_loss: 1.3998 - classification_loss: 0.2072 5/500 [..............................] - ETA: 2:49 - loss: 1.4387 - regression_loss: 1.2638 - classification_loss: 0.1749 6/500 [..............................] - ETA: 2:47 - loss: 1.4438 - regression_loss: 1.2788 - classification_loss: 0.1651 7/500 [..............................] - ETA: 2:48 - loss: 1.3704 - regression_loss: 1.2103 - classification_loss: 0.1601 8/500 [..............................] - ETA: 2:48 - loss: 1.3585 - regression_loss: 1.2000 - classification_loss: 0.1585 9/500 [..............................] 
- ETA: 2:47 - loss: 1.4349 - regression_loss: 1.2682 - classification_loss: 0.1667 10/500 [..............................] - ETA: 2:47 - loss: 1.5503 - regression_loss: 1.3343 - classification_loss: 0.2160 11/500 [..............................] - ETA: 2:47 - loss: 1.4809 - regression_loss: 1.2775 - classification_loss: 0.2035 12/500 [..............................] - ETA: 2:47 - loss: 1.4428 - regression_loss: 1.2462 - classification_loss: 0.1966 13/500 [..............................] - ETA: 2:46 - loss: 1.3994 - regression_loss: 1.2113 - classification_loss: 0.1881 14/500 [..............................] - ETA: 2:46 - loss: 1.3712 - regression_loss: 1.1894 - classification_loss: 0.1817 15/500 [..............................] - ETA: 2:45 - loss: 1.3812 - regression_loss: 1.2003 - classification_loss: 0.1809 16/500 [..............................] - ETA: 2:44 - loss: 1.3591 - regression_loss: 1.1836 - classification_loss: 0.1755 17/500 [>.............................] - ETA: 2:44 - loss: 1.3605 - regression_loss: 1.1786 - classification_loss: 0.1819 18/500 [>.............................] - ETA: 2:43 - loss: 1.3314 - regression_loss: 1.1560 - classification_loss: 0.1754 19/500 [>.............................] - ETA: 2:43 - loss: 1.2881 - regression_loss: 1.1181 - classification_loss: 0.1700 20/500 [>.............................] - ETA: 2:43 - loss: 1.3208 - regression_loss: 1.1416 - classification_loss: 0.1791 21/500 [>.............................] - ETA: 2:43 - loss: 1.3469 - regression_loss: 1.1652 - classification_loss: 0.1817 22/500 [>.............................] - ETA: 2:43 - loss: 1.3274 - regression_loss: 1.1494 - classification_loss: 0.1780 23/500 [>.............................] - ETA: 2:42 - loss: 1.3407 - regression_loss: 1.1647 - classification_loss: 0.1760 24/500 [>.............................] - ETA: 2:42 - loss: 1.3237 - regression_loss: 1.1528 - classification_loss: 0.1709 25/500 [>.............................] 
- ETA: 2:42 - loss: 1.3444 - regression_loss: 1.1742 - classification_loss: 0.1702 26/500 [>.............................] - ETA: 2:42 - loss: 1.3224 - regression_loss: 1.1558 - classification_loss: 0.1666 27/500 [>.............................] - ETA: 2:41 - loss: 1.3130 - regression_loss: 1.1488 - classification_loss: 0.1642 28/500 [>.............................] - ETA: 2:41 - loss: 1.3040 - regression_loss: 1.1406 - classification_loss: 0.1635 29/500 [>.............................] - ETA: 2:41 - loss: 1.2783 - regression_loss: 1.1184 - classification_loss: 0.1599 30/500 [>.............................] - ETA: 2:41 - loss: 1.2732 - regression_loss: 1.1144 - classification_loss: 0.1588 31/500 [>.............................] - ETA: 2:40 - loss: 1.2677 - regression_loss: 1.1102 - classification_loss: 0.1575 32/500 [>.............................] - ETA: 2:40 - loss: 1.2599 - regression_loss: 1.1006 - classification_loss: 0.1592 33/500 [>.............................] - ETA: 2:39 - loss: 1.2539 - regression_loss: 1.0952 - classification_loss: 0.1587 34/500 [=>............................] - ETA: 2:38 - loss: 1.2615 - regression_loss: 1.1015 - classification_loss: 0.1600 35/500 [=>............................] - ETA: 2:38 - loss: 1.2656 - regression_loss: 1.1058 - classification_loss: 0.1598 36/500 [=>............................] - ETA: 2:38 - loss: 1.2547 - regression_loss: 1.0980 - classification_loss: 0.1567 37/500 [=>............................] - ETA: 2:37 - loss: 1.2530 - regression_loss: 1.0975 - classification_loss: 0.1555 38/500 [=>............................] - ETA: 2:37 - loss: 1.2337 - regression_loss: 1.0816 - classification_loss: 0.1522 39/500 [=>............................] - ETA: 2:36 - loss: 1.2341 - regression_loss: 1.0830 - classification_loss: 0.1511 40/500 [=>............................] - ETA: 2:36 - loss: 1.2431 - regression_loss: 1.0918 - classification_loss: 0.1513 41/500 [=>............................] 
- ETA: 2:36 - loss: 1.2423 - regression_loss: 1.0926 - classification_loss: 0.1497 42/500 [=>............................] - ETA: 2:36 - loss: 1.2415 - regression_loss: 1.0905 - classification_loss: 0.1510 43/500 [=>............................] - ETA: 2:35 - loss: 1.2319 - regression_loss: 1.0829 - classification_loss: 0.1491 44/500 [=>............................] - ETA: 2:35 - loss: 1.2183 - regression_loss: 1.0711 - classification_loss: 0.1472 45/500 [=>............................] - ETA: 2:34 - loss: 1.2115 - regression_loss: 1.0653 - classification_loss: 0.1463 46/500 [=>............................] - ETA: 2:34 - loss: 1.2116 - regression_loss: 1.0641 - classification_loss: 0.1474 47/500 [=>............................] - ETA: 2:34 - loss: 1.1964 - regression_loss: 1.0513 - classification_loss: 0.1450 48/500 [=>............................] - ETA: 2:33 - loss: 1.1886 - regression_loss: 1.0449 - classification_loss: 0.1437 49/500 [=>............................] - ETA: 2:33 - loss: 1.1890 - regression_loss: 1.0459 - classification_loss: 0.1432 50/500 [==>...........................] - ETA: 2:33 - loss: 1.1916 - regression_loss: 1.0488 - classification_loss: 0.1427 51/500 [==>...........................] - ETA: 2:32 - loss: 1.1896 - regression_loss: 1.0474 - classification_loss: 0.1422 52/500 [==>...........................] - ETA: 2:32 - loss: 1.1809 - regression_loss: 1.0400 - classification_loss: 0.1409 53/500 [==>...........................] - ETA: 2:31 - loss: 1.1804 - regression_loss: 1.0401 - classification_loss: 0.1404 54/500 [==>...........................] - ETA: 2:31 - loss: 1.1686 - regression_loss: 1.0301 - classification_loss: 0.1385 55/500 [==>...........................] - ETA: 2:31 - loss: 1.1758 - regression_loss: 1.0366 - classification_loss: 0.1392 56/500 [==>...........................] - ETA: 2:30 - loss: 1.1766 - regression_loss: 1.0380 - classification_loss: 0.1386 57/500 [==>...........................] 
- ETA: 2:30 - loss: 1.1643 - regression_loss: 1.0276 - classification_loss: 0.1367 58/500 [==>...........................] - ETA: 2:30 - loss: 1.1645 - regression_loss: 1.0281 - classification_loss: 0.1364 59/500 [==>...........................] - ETA: 2:29 - loss: 1.1607 - regression_loss: 1.0245 - classification_loss: 0.1361 60/500 [==>...........................] - ETA: 2:29 - loss: 1.1503 - regression_loss: 1.0158 - classification_loss: 0.1346 61/500 [==>...........................] - ETA: 2:29 - loss: 1.1509 - regression_loss: 1.0165 - classification_loss: 0.1343 62/500 [==>...........................] - ETA: 2:28 - loss: 1.1447 - regression_loss: 1.0109 - classification_loss: 0.1338 63/500 [==>...........................] - ETA: 2:28 - loss: 1.1320 - regression_loss: 0.9998 - classification_loss: 0.1322 64/500 [==>...........................] - ETA: 2:28 - loss: 1.1510 - regression_loss: 1.0145 - classification_loss: 0.1365 65/500 [==>...........................] - ETA: 2:27 - loss: 1.1442 - regression_loss: 1.0094 - classification_loss: 0.1348 66/500 [==>...........................] - ETA: 2:27 - loss: 1.1390 - regression_loss: 1.0052 - classification_loss: 0.1338 67/500 [===>..........................] - ETA: 2:27 - loss: 1.1403 - regression_loss: 1.0062 - classification_loss: 0.1341 68/500 [===>..........................] - ETA: 2:26 - loss: 1.1347 - regression_loss: 1.0022 - classification_loss: 0.1326 69/500 [===>..........................] - ETA: 2:26 - loss: 1.1326 - regression_loss: 1.0008 - classification_loss: 0.1318 70/500 [===>..........................] - ETA: 2:26 - loss: 1.1265 - regression_loss: 0.9957 - classification_loss: 0.1308 71/500 [===>..........................] - ETA: 2:26 - loss: 1.1421 - regression_loss: 1.0116 - classification_loss: 0.1305 72/500 [===>..........................] - ETA: 2:25 - loss: 1.1479 - regression_loss: 1.0177 - classification_loss: 0.1302 73/500 [===>..........................] 
- ETA: 2:25 - loss: 1.1402 - regression_loss: 1.0111 - classification_loss: 0.1292 74/500 [===>..........................] - ETA: 2:24 - loss: 1.1561 - regression_loss: 1.0198 - classification_loss: 0.1363 75/500 [===>..........................] - ETA: 2:24 - loss: 1.1640 - regression_loss: 1.0273 - classification_loss: 0.1366 76/500 [===>..........................] - ETA: 2:24 - loss: 1.1682 - regression_loss: 1.0302 - classification_loss: 0.1380 77/500 [===>..........................] - ETA: 2:24 - loss: 1.1683 - regression_loss: 1.0301 - classification_loss: 0.1382 78/500 [===>..........................] - ETA: 2:23 - loss: 1.1619 - regression_loss: 1.0246 - classification_loss: 0.1373 79/500 [===>..........................] - ETA: 2:23 - loss: 1.1625 - regression_loss: 1.0251 - classification_loss: 0.1374 80/500 [===>..........................] - ETA: 2:22 - loss: 1.1587 - regression_loss: 1.0218 - classification_loss: 0.1369 81/500 [===>..........................] - ETA: 2:22 - loss: 1.1576 - regression_loss: 1.0212 - classification_loss: 0.1364 82/500 [===>..........................] - ETA: 2:22 - loss: 1.1653 - regression_loss: 1.0274 - classification_loss: 0.1380 83/500 [===>..........................] - ETA: 2:21 - loss: 1.1662 - regression_loss: 1.0269 - classification_loss: 0.1393 84/500 [====>.........................] - ETA: 2:21 - loss: 1.1690 - regression_loss: 1.0291 - classification_loss: 0.1399 85/500 [====>.........................] - ETA: 2:21 - loss: 1.1678 - regression_loss: 1.0281 - classification_loss: 0.1397 86/500 [====>.........................] - ETA: 2:20 - loss: 1.1730 - regression_loss: 1.0341 - classification_loss: 0.1388 87/500 [====>.........................] - ETA: 2:20 - loss: 1.1831 - regression_loss: 1.0435 - classification_loss: 0.1396 88/500 [====>.........................] - ETA: 2:20 - loss: 1.1904 - regression_loss: 1.0482 - classification_loss: 0.1422 89/500 [====>.........................] 
- ETA: 2:19 - loss: 1.1942 - regression_loss: 1.0522 - classification_loss: 0.1420 90/500 [====>.........................] - ETA: 2:19 - loss: 1.1988 - regression_loss: 1.0559 - classification_loss: 0.1429 91/500 [====>.........................] - ETA: 2:19 - loss: 1.1962 - regression_loss: 1.0540 - classification_loss: 0.1422 92/500 [====>.........................] - ETA: 2:18 - loss: 1.1957 - regression_loss: 1.0536 - classification_loss: 0.1422 93/500 [====>.........................] - ETA: 2:18 - loss: 1.2005 - regression_loss: 1.0578 - classification_loss: 0.1427 94/500 [====>.........................] - ETA: 2:18 - loss: 1.1972 - regression_loss: 1.0554 - classification_loss: 0.1418 95/500 [====>.........................] - ETA: 2:17 - loss: 1.1951 - regression_loss: 1.0532 - classification_loss: 0.1420 96/500 [====>.........................] - ETA: 2:17 - loss: 1.1868 - regression_loss: 1.0460 - classification_loss: 0.1407 97/500 [====>.........................] - ETA: 2:17 - loss: 1.1858 - regression_loss: 1.0453 - classification_loss: 0.1404 98/500 [====>.........................] - ETA: 2:17 - loss: 1.1860 - regression_loss: 1.0460 - classification_loss: 0.1399 99/500 [====>.........................] - ETA: 2:16 - loss: 1.1930 - regression_loss: 1.0515 - classification_loss: 0.1415 100/500 [=====>........................] - ETA: 2:16 - loss: 1.2039 - regression_loss: 1.0598 - classification_loss: 0.1441 101/500 [=====>........................] - ETA: 2:16 - loss: 1.2032 - regression_loss: 1.0591 - classification_loss: 0.1440 102/500 [=====>........................] - ETA: 2:15 - loss: 1.1987 - regression_loss: 1.0551 - classification_loss: 0.1436 103/500 [=====>........................] - ETA: 2:15 - loss: 1.1951 - regression_loss: 1.0519 - classification_loss: 0.1432 104/500 [=====>........................] - ETA: 2:15 - loss: 1.1948 - regression_loss: 1.0520 - classification_loss: 0.1427 105/500 [=====>........................] 
- ETA: 2:14 - loss: 1.1949 - regression_loss: 1.0514 - classification_loss: 0.1435 106/500 [=====>........................] - ETA: 2:14 - loss: 1.1897 - regression_loss: 1.0468 - classification_loss: 0.1429 107/500 [=====>........................] - ETA: 2:13 - loss: 1.1918 - regression_loss: 1.0478 - classification_loss: 0.1440 108/500 [=====>........................] - ETA: 2:13 - loss: 1.1878 - regression_loss: 1.0446 - classification_loss: 0.1433 109/500 [=====>........................] - ETA: 2:13 - loss: 1.1849 - regression_loss: 1.0423 - classification_loss: 0.1426 110/500 [=====>........................] - ETA: 2:12 - loss: 1.1803 - regression_loss: 1.0386 - classification_loss: 0.1417 111/500 [=====>........................] - ETA: 2:12 - loss: 1.1873 - regression_loss: 1.0449 - classification_loss: 0.1424 112/500 [=====>........................] - ETA: 2:12 - loss: 1.1876 - regression_loss: 1.0454 - classification_loss: 0.1422 113/500 [=====>........................] - ETA: 2:11 - loss: 1.1900 - regression_loss: 1.0464 - classification_loss: 0.1436 114/500 [=====>........................] - ETA: 2:11 - loss: 1.1859 - regression_loss: 1.0429 - classification_loss: 0.1429 115/500 [=====>........................] - ETA: 2:11 - loss: 1.1870 - regression_loss: 1.0436 - classification_loss: 0.1434 116/500 [=====>........................] - ETA: 2:10 - loss: 1.1895 - regression_loss: 1.0457 - classification_loss: 0.1437 117/500 [======>.......................] - ETA: 2:10 - loss: 1.1829 - regression_loss: 1.0401 - classification_loss: 0.1428 118/500 [======>.......................] - ETA: 2:10 - loss: 1.1805 - regression_loss: 1.0382 - classification_loss: 0.1423 119/500 [======>.......................] - ETA: 2:09 - loss: 1.1727 - regression_loss: 1.0311 - classification_loss: 0.1416 120/500 [======>.......................] - ETA: 2:09 - loss: 1.1677 - regression_loss: 1.0264 - classification_loss: 0.1413 121/500 [======>.......................] 
- ETA: 2:08 - loss: 1.1664 - regression_loss: 1.0255 - classification_loss: 0.1408 122/500 [======>.......................] - ETA: 2:08 - loss: 1.1658 - regression_loss: 1.0251 - classification_loss: 0.1406 123/500 [======>.......................] - ETA: 2:08 - loss: 1.1724 - regression_loss: 1.0317 - classification_loss: 0.1407 124/500 [======>.......................] - ETA: 2:08 - loss: 1.1692 - regression_loss: 1.0293 - classification_loss: 0.1399 125/500 [======>.......................] - ETA: 2:07 - loss: 1.1773 - regression_loss: 1.0374 - classification_loss: 0.1399 126/500 [======>.......................] - ETA: 2:07 - loss: 1.1800 - regression_loss: 1.0391 - classification_loss: 0.1409 127/500 [======>.......................] - ETA: 2:07 - loss: 1.1824 - regression_loss: 1.0408 - classification_loss: 0.1416 128/500 [======>.......................] - ETA: 2:06 - loss: 1.1801 - regression_loss: 1.0392 - classification_loss: 0.1410 129/500 [======>.......................] - ETA: 2:06 - loss: 1.1767 - regression_loss: 1.0363 - classification_loss: 0.1403 130/500 [======>.......................] - ETA: 2:06 - loss: 1.1736 - regression_loss: 1.0340 - classification_loss: 0.1396 131/500 [======>.......................] - ETA: 2:05 - loss: 1.1718 - regression_loss: 1.0328 - classification_loss: 0.1390 132/500 [======>.......................] - ETA: 2:05 - loss: 1.1722 - regression_loss: 1.0332 - classification_loss: 0.1390 133/500 [======>.......................] - ETA: 2:04 - loss: 1.1722 - regression_loss: 1.0331 - classification_loss: 0.1391 134/500 [=======>......................] - ETA: 2:04 - loss: 1.1716 - regression_loss: 1.0323 - classification_loss: 0.1393 135/500 [=======>......................] - ETA: 2:04 - loss: 1.1681 - regression_loss: 1.0289 - classification_loss: 0.1393 136/500 [=======>......................] - ETA: 2:03 - loss: 1.1620 - regression_loss: 1.0236 - classification_loss: 0.1385 137/500 [=======>......................] 
- ETA: 2:03 - loss: 1.1616 - regression_loss: 1.0233 - classification_loss: 0.1383 138/500 [=======>......................] - ETA: 2:03 - loss: 1.1593 - regression_loss: 1.0211 - classification_loss: 0.1382 139/500 [=======>......................] - ETA: 2:02 - loss: 1.1532 - regression_loss: 1.0159 - classification_loss: 0.1374 140/500 [=======>......................] - ETA: 2:02 - loss: 1.1582 - regression_loss: 1.0202 - classification_loss: 0.1380 141/500 [=======>......................] - ETA: 2:02 - loss: 1.1604 - regression_loss: 1.0222 - classification_loss: 0.1382 142/500 [=======>......................] - ETA: 2:01 - loss: 1.1573 - regression_loss: 1.0195 - classification_loss: 0.1378 143/500 [=======>......................] - ETA: 2:01 - loss: 1.1561 - regression_loss: 1.0182 - classification_loss: 0.1379 144/500 [=======>......................] - ETA: 2:01 - loss: 1.1501 - regression_loss: 1.0129 - classification_loss: 0.1372 145/500 [=======>......................] - ETA: 2:00 - loss: 1.1507 - regression_loss: 1.0134 - classification_loss: 0.1373 146/500 [=======>......................] - ETA: 2:00 - loss: 1.1461 - regression_loss: 1.0094 - classification_loss: 0.1367 147/500 [=======>......................] - ETA: 2:00 - loss: 1.1507 - regression_loss: 1.0120 - classification_loss: 0.1387 148/500 [=======>......................] - ETA: 1:59 - loss: 1.1466 - regression_loss: 1.0085 - classification_loss: 0.1381 149/500 [=======>......................] - ETA: 1:59 - loss: 1.1432 - regression_loss: 1.0057 - classification_loss: 0.1374 150/500 [========>.....................] - ETA: 1:59 - loss: 1.1421 - regression_loss: 1.0046 - classification_loss: 0.1375 151/500 [========>.....................] - ETA: 1:58 - loss: 1.1407 - regression_loss: 1.0035 - classification_loss: 0.1372 152/500 [========>.....................] - ETA: 1:58 - loss: 1.1383 - regression_loss: 1.0015 - classification_loss: 0.1368 153/500 [========>.....................] 
- ETA: 1:57 - loss: 1.1357 - regression_loss: 0.9995 - classification_loss: 0.1362 154/500 [========>.....................] - ETA: 1:57 - loss: 1.1325 - regression_loss: 0.9967 - classification_loss: 0.1358 155/500 [========>.....................] - ETA: 1:57 - loss: 1.1317 - regression_loss: 0.9961 - classification_loss: 0.1356 156/500 [========>.....................] - ETA: 1:57 - loss: 1.1322 - regression_loss: 0.9966 - classification_loss: 0.1356 157/500 [========>.....................] - ETA: 1:56 - loss: 1.1322 - regression_loss: 0.9968 - classification_loss: 0.1354 158/500 [========>.....................] - ETA: 1:56 - loss: 1.1315 - regression_loss: 0.9963 - classification_loss: 0.1352 159/500 [========>.....................] - ETA: 1:56 - loss: 1.1302 - regression_loss: 0.9951 - classification_loss: 0.1351 160/500 [========>.....................] - ETA: 1:55 - loss: 1.1275 - regression_loss: 0.9928 - classification_loss: 0.1346 161/500 [========>.....................] - ETA: 1:55 - loss: 1.1270 - regression_loss: 0.9922 - classification_loss: 0.1347 162/500 [========>.....................] - ETA: 1:55 - loss: 1.1242 - regression_loss: 0.9897 - classification_loss: 0.1344 163/500 [========>.....................] - ETA: 1:54 - loss: 1.1288 - regression_loss: 0.9937 - classification_loss: 0.1351 164/500 [========>.....................] - ETA: 1:54 - loss: 1.1296 - regression_loss: 0.9942 - classification_loss: 0.1353 165/500 [========>.....................] - ETA: 1:53 - loss: 1.1257 - regression_loss: 0.9910 - classification_loss: 0.1347 166/500 [========>.....................] - ETA: 1:53 - loss: 1.1274 - regression_loss: 0.9925 - classification_loss: 0.1349 167/500 [=========>....................] - ETA: 1:53 - loss: 1.1294 - regression_loss: 0.9940 - classification_loss: 0.1353 168/500 [=========>....................] - ETA: 1:52 - loss: 1.1307 - regression_loss: 0.9951 - classification_loss: 0.1355 169/500 [=========>....................] 
- ETA: 1:52 - loss: 1.1300 - regression_loss: 0.9948 - classification_loss: 0.1352 170/500 [=========>....................] - ETA: 1:52 - loss: 1.1263 - regression_loss: 0.9918 - classification_loss: 0.1345 171/500 [=========>....................] - ETA: 1:51 - loss: 1.1261 - regression_loss: 0.9917 - classification_loss: 0.1344 172/500 [=========>....................] - ETA: 1:51 - loss: 1.1236 - regression_loss: 0.9897 - classification_loss: 0.1340 173/500 [=========>....................] - ETA: 1:51 - loss: 1.1238 - regression_loss: 0.9901 - classification_loss: 0.1337 174/500 [=========>....................] - ETA: 1:50 - loss: 1.1198 - regression_loss: 0.9867 - classification_loss: 0.1331 175/500 [=========>....................] - ETA: 1:50 - loss: 1.1197 - regression_loss: 0.9867 - classification_loss: 0.1329 176/500 [=========>....................] - ETA: 1:50 - loss: 1.1283 - regression_loss: 0.9933 - classification_loss: 0.1351 177/500 [=========>....................] - ETA: 1:49 - loss: 1.1284 - regression_loss: 0.9937 - classification_loss: 0.1348 178/500 [=========>....................] - ETA: 1:49 - loss: 1.1283 - regression_loss: 0.9934 - classification_loss: 0.1349 179/500 [=========>....................] - ETA: 1:49 - loss: 1.1314 - regression_loss: 0.9955 - classification_loss: 0.1359 180/500 [=========>....................] - ETA: 1:48 - loss: 1.1306 - regression_loss: 0.9953 - classification_loss: 0.1354 181/500 [=========>....................] - ETA: 1:48 - loss: 1.1315 - regression_loss: 0.9959 - classification_loss: 0.1356 182/500 [=========>....................] - ETA: 1:48 - loss: 1.1309 - regression_loss: 0.9954 - classification_loss: 0.1355 183/500 [=========>....................] - ETA: 1:47 - loss: 1.1284 - regression_loss: 0.9932 - classification_loss: 0.1351 184/500 [==========>...................] - ETA: 1:47 - loss: 1.1293 - regression_loss: 0.9942 - classification_loss: 0.1351 185/500 [==========>...................] 
- ETA: 1:47 - loss: 1.1295 - regression_loss: 0.9941 - classification_loss: 0.1354 186/500 [==========>...................] - ETA: 1:46 - loss: 1.1295 - regression_loss: 0.9940 - classification_loss: 0.1355 187/500 [==========>...................] - ETA: 1:46 - loss: 1.1284 - regression_loss: 0.9931 - classification_loss: 0.1352 188/500 [==========>...................] - ETA: 1:46 - loss: 1.1264 - regression_loss: 0.9915 - classification_loss: 0.1349 189/500 [==========>...................] - ETA: 1:45 - loss: 1.1252 - regression_loss: 0.9905 - classification_loss: 0.1347 190/500 [==========>...................] - ETA: 1:45 - loss: 1.1241 - regression_loss: 0.9896 - classification_loss: 0.1345 191/500 [==========>...................] - ETA: 1:44 - loss: 1.1260 - regression_loss: 0.9909 - classification_loss: 0.1351 192/500 [==========>...................] - ETA: 1:44 - loss: 1.1255 - regression_loss: 0.9908 - classification_loss: 0.1347 193/500 [==========>...................] - ETA: 1:44 - loss: 1.1243 - regression_loss: 0.9897 - classification_loss: 0.1346 194/500 [==========>...................] - ETA: 1:43 - loss: 1.1218 - regression_loss: 0.9877 - classification_loss: 0.1341 195/500 [==========>...................] - ETA: 1:43 - loss: 1.1197 - regression_loss: 0.9859 - classification_loss: 0.1338 196/500 [==========>...................] - ETA: 1:43 - loss: 1.1178 - regression_loss: 0.9844 - classification_loss: 0.1334 197/500 [==========>...................] - ETA: 1:42 - loss: 1.1169 - regression_loss: 0.9836 - classification_loss: 0.1334 198/500 [==========>...................] - ETA: 1:42 - loss: 1.1155 - regression_loss: 0.9820 - classification_loss: 0.1335 199/500 [==========>...................] - ETA: 1:42 - loss: 1.1219 - regression_loss: 0.9872 - classification_loss: 0.1347 200/500 [===========>..................] - ETA: 1:41 - loss: 1.1199 - regression_loss: 0.9855 - classification_loss: 0.1344 201/500 [===========>..................] 
- ETA: 1:41 - loss: 1.1190 - regression_loss: 0.9849 - classification_loss: 0.1341 202/500 [===========>..................] - ETA: 1:41 - loss: 1.1176 - regression_loss: 0.9838 - classification_loss: 0.1338 203/500 [===========>..................] - ETA: 1:40 - loss: 1.1186 - regression_loss: 0.9850 - classification_loss: 0.1336 204/500 [===========>..................] - ETA: 1:40 - loss: 1.1198 - regression_loss: 0.9861 - classification_loss: 0.1337 205/500 [===========>..................] - ETA: 1:40 - loss: 1.1194 - regression_loss: 0.9857 - classification_loss: 0.1337 206/500 [===========>..................] - ETA: 1:39 - loss: 1.1234 - regression_loss: 0.9890 - classification_loss: 0.1345 207/500 [===========>..................] - ETA: 1:39 - loss: 1.1231 - regression_loss: 0.9887 - classification_loss: 0.1344 208/500 [===========>..................] - ETA: 1:39 - loss: 1.1228 - regression_loss: 0.9885 - classification_loss: 0.1343 209/500 [===========>..................] - ETA: 1:38 - loss: 1.1231 - regression_loss: 0.9886 - classification_loss: 0.1345 210/500 [===========>..................] - ETA: 1:38 - loss: 1.1219 - regression_loss: 0.9874 - classification_loss: 0.1345 211/500 [===========>..................] - ETA: 1:38 - loss: 1.1217 - regression_loss: 0.9873 - classification_loss: 0.1343 212/500 [===========>..................] - ETA: 1:37 - loss: 1.1221 - regression_loss: 0.9877 - classification_loss: 0.1344 213/500 [===========>..................] - ETA: 1:37 - loss: 1.1241 - regression_loss: 0.9891 - classification_loss: 0.1351 214/500 [===========>..................] - ETA: 1:37 - loss: 1.1248 - regression_loss: 0.9901 - classification_loss: 0.1348 215/500 [===========>..................] - ETA: 1:36 - loss: 1.1253 - regression_loss: 0.9907 - classification_loss: 0.1346 216/500 [===========>..................] - ETA: 1:36 - loss: 1.1249 - regression_loss: 0.9902 - classification_loss: 0.1347 217/500 [============>.................] 
- ETA: 1:36 - loss: 1.1235 - regression_loss: 0.9886 - classification_loss: 0.1349 218/500 [============>.................] - ETA: 1:35 - loss: 1.1302 - regression_loss: 0.9947 - classification_loss: 0.1355 219/500 [============>.................] - ETA: 1:35 - loss: 1.1289 - regression_loss: 0.9934 - classification_loss: 0.1356 220/500 [============>.................] - ETA: 1:34 - loss: 1.1325 - regression_loss: 0.9959 - classification_loss: 0.1366 221/500 [============>.................] - ETA: 1:34 - loss: 1.1321 - regression_loss: 0.9957 - classification_loss: 0.1363 222/500 [============>.................] - ETA: 1:34 - loss: 1.1300 - regression_loss: 0.9939 - classification_loss: 0.1361 223/500 [============>.................] - ETA: 1:33 - loss: 1.1282 - regression_loss: 0.9924 - classification_loss: 0.1358 224/500 [============>.................] - ETA: 1:33 - loss: 1.1294 - regression_loss: 0.9934 - classification_loss: 0.1359 225/500 [============>.................] - ETA: 1:33 - loss: 1.1270 - regression_loss: 0.9914 - classification_loss: 0.1355 226/500 [============>.................] - ETA: 1:32 - loss: 1.1293 - regression_loss: 0.9933 - classification_loss: 0.1360 227/500 [============>.................] - ETA: 1:32 - loss: 1.1278 - regression_loss: 0.9920 - classification_loss: 0.1358 228/500 [============>.................] - ETA: 1:32 - loss: 1.1287 - regression_loss: 0.9928 - classification_loss: 0.1359 229/500 [============>.................] - ETA: 1:31 - loss: 1.1264 - regression_loss: 0.9908 - classification_loss: 0.1356 230/500 [============>.................] - ETA: 1:31 - loss: 1.1258 - regression_loss: 0.9902 - classification_loss: 0.1356 231/500 [============>.................] - ETA: 1:31 - loss: 1.1258 - regression_loss: 0.9905 - classification_loss: 0.1354 232/500 [============>.................] - ETA: 1:30 - loss: 1.1277 - regression_loss: 0.9925 - classification_loss: 0.1353 233/500 [============>.................] 
- ETA: 1:30 - loss: 1.1269 - regression_loss: 0.9919 - classification_loss: 0.1350 234/500 [=============>................] - ETA: 1:30 - loss: 1.1258 - regression_loss: 0.9909 - classification_loss: 0.1348 235/500 [=============>................] - ETA: 1:29 - loss: 1.1293 - regression_loss: 0.9938 - classification_loss: 0.1354 236/500 [=============>................] - ETA: 1:29 - loss: 1.1276 - regression_loss: 0.9925 - classification_loss: 0.1351 237/500 [=============>................] - ETA: 1:29 - loss: 1.1260 - regression_loss: 0.9912 - classification_loss: 0.1349 238/500 [=============>................] - ETA: 1:28 - loss: 1.1251 - regression_loss: 0.9903 - classification_loss: 0.1348 239/500 [=============>................] - ETA: 1:28 - loss: 1.1237 - regression_loss: 0.9891 - classification_loss: 0.1346 240/500 [=============>................] - ETA: 1:28 - loss: 1.1239 - regression_loss: 0.9892 - classification_loss: 0.1347 241/500 [=============>................] - ETA: 1:27 - loss: 1.1248 - regression_loss: 0.9898 - classification_loss: 0.1350 242/500 [=============>................] - ETA: 1:27 - loss: 1.1241 - regression_loss: 0.9893 - classification_loss: 0.1348 243/500 [=============>................] - ETA: 1:27 - loss: 1.1240 - regression_loss: 0.9894 - classification_loss: 0.1346 244/500 [=============>................] - ETA: 1:26 - loss: 1.1238 - regression_loss: 0.9895 - classification_loss: 0.1343 245/500 [=============>................] - ETA: 1:26 - loss: 1.1244 - regression_loss: 0.9902 - classification_loss: 0.1342 246/500 [=============>................] - ETA: 1:26 - loss: 1.1247 - regression_loss: 0.9906 - classification_loss: 0.1341 247/500 [=============>................] - ETA: 1:25 - loss: 1.1257 - regression_loss: 0.9916 - classification_loss: 0.1341 248/500 [=============>................] - ETA: 1:25 - loss: 1.1255 - regression_loss: 0.9916 - classification_loss: 0.1339 249/500 [=============>................] 
... (per-batch progress updates for steps 250-489 of epoch 21 omitted; running loss ~1.10-1.13, regression_loss ~0.97-0.99, classification_loss ~0.13) ...
500/500 [==============================] - 170s 339ms/step - loss: 1.1032 - regression_loss: 0.9703 - classification_loss: 0.1329
326 instances of class plum with average precision: 0.8227
mAP: 0.8227
Epoch 00021: saving model to ./training/snapshots/resnet101_pascal_21.h5
Epoch 22/150
... (per-batch progress updates for steps 1-67 of epoch 22 omitted; running loss ~0.98-1.22) ...
 68/500 [===>..........................]
- ETA: 2:26 - loss: 1.0851 - regression_loss: 0.9525 - classification_loss: 0.1325 69/500 [===>..........................] - ETA: 2:26 - loss: 1.0758 - regression_loss: 0.9446 - classification_loss: 0.1312 70/500 [===>..........................] - ETA: 2:26 - loss: 1.0826 - regression_loss: 0.9502 - classification_loss: 0.1324 71/500 [===>..........................] - ETA: 2:25 - loss: 1.0930 - regression_loss: 0.9573 - classification_loss: 0.1357 72/500 [===>..........................] - ETA: 2:25 - loss: 1.0828 - regression_loss: 0.9487 - classification_loss: 0.1342 73/500 [===>..........................] - ETA: 2:25 - loss: 1.0879 - regression_loss: 0.9537 - classification_loss: 0.1343 74/500 [===>..........................] - ETA: 2:24 - loss: 1.1087 - regression_loss: 0.9715 - classification_loss: 0.1372 75/500 [===>..........................] - ETA: 2:24 - loss: 1.1108 - regression_loss: 0.9736 - classification_loss: 0.1373 76/500 [===>..........................] - ETA: 2:24 - loss: 1.1071 - regression_loss: 0.9705 - classification_loss: 0.1366 77/500 [===>..........................] - ETA: 2:24 - loss: 1.1012 - regression_loss: 0.9658 - classification_loss: 0.1354 78/500 [===>..........................] - ETA: 2:23 - loss: 1.1007 - regression_loss: 0.9659 - classification_loss: 0.1348 79/500 [===>..........................] - ETA: 2:23 - loss: 1.1002 - regression_loss: 0.9660 - classification_loss: 0.1342 80/500 [===>..........................] - ETA: 2:23 - loss: 1.0946 - regression_loss: 0.9614 - classification_loss: 0.1332 81/500 [===>..........................] - ETA: 2:22 - loss: 1.0943 - regression_loss: 0.9602 - classification_loss: 0.1341 82/500 [===>..........................] - ETA: 2:22 - loss: 1.1041 - regression_loss: 0.9681 - classification_loss: 0.1359 83/500 [===>..........................] - ETA: 2:22 - loss: 1.0918 - regression_loss: 0.9565 - classification_loss: 0.1354 84/500 [====>.........................] 
- ETA: 2:21 - loss: 1.0872 - regression_loss: 0.9523 - classification_loss: 0.1349 85/500 [====>.........................] - ETA: 2:21 - loss: 1.0855 - regression_loss: 0.9510 - classification_loss: 0.1345 86/500 [====>.........................] - ETA: 2:21 - loss: 1.0830 - regression_loss: 0.9489 - classification_loss: 0.1341 87/500 [====>.........................] - ETA: 2:20 - loss: 1.0848 - regression_loss: 0.9497 - classification_loss: 0.1351 88/500 [====>.........................] - ETA: 2:20 - loss: 1.0804 - regression_loss: 0.9461 - classification_loss: 0.1343 89/500 [====>.........................] - ETA: 2:19 - loss: 1.0763 - regression_loss: 0.9432 - classification_loss: 0.1331 90/500 [====>.........................] - ETA: 2:19 - loss: 1.0797 - regression_loss: 0.9456 - classification_loss: 0.1341 91/500 [====>.........................] - ETA: 2:19 - loss: 1.0793 - regression_loss: 0.9455 - classification_loss: 0.1338 92/500 [====>.........................] - ETA: 2:18 - loss: 1.0765 - regression_loss: 0.9434 - classification_loss: 0.1331 93/500 [====>.........................] - ETA: 2:18 - loss: 1.0834 - regression_loss: 0.9498 - classification_loss: 0.1336 94/500 [====>.........................] - ETA: 2:18 - loss: 1.0840 - regression_loss: 0.9511 - classification_loss: 0.1329 95/500 [====>.........................] - ETA: 2:17 - loss: 1.0894 - regression_loss: 0.9562 - classification_loss: 0.1332 96/500 [====>.........................] - ETA: 2:17 - loss: 1.0880 - regression_loss: 0.9555 - classification_loss: 0.1325 97/500 [====>.........................] - ETA: 2:17 - loss: 1.0933 - regression_loss: 0.9586 - classification_loss: 0.1347 98/500 [====>.........................] - ETA: 2:16 - loss: 1.0921 - regression_loss: 0.9574 - classification_loss: 0.1347 99/500 [====>.........................] - ETA: 2:16 - loss: 1.0896 - regression_loss: 0.9557 - classification_loss: 0.1339 100/500 [=====>........................] 
- ETA: 2:16 - loss: 1.0853 - regression_loss: 0.9521 - classification_loss: 0.1332 101/500 [=====>........................] - ETA: 2:15 - loss: 1.0872 - regression_loss: 0.9536 - classification_loss: 0.1336 102/500 [=====>........................] - ETA: 2:15 - loss: 1.0844 - regression_loss: 0.9513 - classification_loss: 0.1332 103/500 [=====>........................] - ETA: 2:15 - loss: 1.0836 - regression_loss: 0.9503 - classification_loss: 0.1333 104/500 [=====>........................] - ETA: 2:14 - loss: 1.0812 - regression_loss: 0.9485 - classification_loss: 0.1327 105/500 [=====>........................] - ETA: 2:14 - loss: 1.0751 - regression_loss: 0.9432 - classification_loss: 0.1319 106/500 [=====>........................] - ETA: 2:14 - loss: 1.0740 - regression_loss: 0.9426 - classification_loss: 0.1314 107/500 [=====>........................] - ETA: 2:13 - loss: 1.0707 - regression_loss: 0.9399 - classification_loss: 0.1308 108/500 [=====>........................] - ETA: 2:13 - loss: 1.0687 - regression_loss: 0.9381 - classification_loss: 0.1306 109/500 [=====>........................] - ETA: 2:13 - loss: 1.0676 - regression_loss: 0.9371 - classification_loss: 0.1305 110/500 [=====>........................] - ETA: 2:12 - loss: 1.0665 - regression_loss: 0.9359 - classification_loss: 0.1306 111/500 [=====>........................] - ETA: 2:12 - loss: 1.0633 - regression_loss: 0.9333 - classification_loss: 0.1301 112/500 [=====>........................] - ETA: 2:12 - loss: 1.0645 - regression_loss: 0.9348 - classification_loss: 0.1298 113/500 [=====>........................] - ETA: 2:11 - loss: 1.0627 - regression_loss: 0.9334 - classification_loss: 0.1293 114/500 [=====>........................] - ETA: 2:11 - loss: 1.0622 - regression_loss: 0.9331 - classification_loss: 0.1291 115/500 [=====>........................] - ETA: 2:11 - loss: 1.0640 - regression_loss: 0.9346 - classification_loss: 0.1294 116/500 [=====>........................] 
- ETA: 2:10 - loss: 1.0628 - regression_loss: 0.9337 - classification_loss: 0.1291 117/500 [======>.......................] - ETA: 2:10 - loss: 1.0624 - regression_loss: 0.9336 - classification_loss: 0.1288 118/500 [======>.......................] - ETA: 2:10 - loss: 1.0604 - regression_loss: 0.9322 - classification_loss: 0.1283 119/500 [======>.......................] - ETA: 2:09 - loss: 1.0551 - regression_loss: 0.9276 - classification_loss: 0.1275 120/500 [======>.......................] - ETA: 2:09 - loss: 1.0547 - regression_loss: 0.9273 - classification_loss: 0.1274 121/500 [======>.......................] - ETA: 2:09 - loss: 1.0537 - regression_loss: 0.9268 - classification_loss: 0.1269 122/500 [======>.......................] - ETA: 2:09 - loss: 1.0598 - regression_loss: 0.9306 - classification_loss: 0.1293 123/500 [======>.......................] - ETA: 2:08 - loss: 1.0602 - regression_loss: 0.9310 - classification_loss: 0.1292 124/500 [======>.......................] - ETA: 2:08 - loss: 1.0575 - regression_loss: 0.9286 - classification_loss: 0.1289 125/500 [======>.......................] - ETA: 2:07 - loss: 1.0629 - regression_loss: 0.9339 - classification_loss: 0.1290 126/500 [======>.......................] - ETA: 2:07 - loss: 1.0702 - regression_loss: 0.9394 - classification_loss: 0.1308 127/500 [======>.......................] - ETA: 2:07 - loss: 1.0649 - regression_loss: 0.9348 - classification_loss: 0.1301 128/500 [======>.......................] - ETA: 2:07 - loss: 1.0649 - regression_loss: 0.9349 - classification_loss: 0.1300 129/500 [======>.......................] - ETA: 2:06 - loss: 1.0607 - regression_loss: 0.9314 - classification_loss: 0.1293 130/500 [======>.......................] - ETA: 2:06 - loss: 1.0569 - regression_loss: 0.9284 - classification_loss: 0.1285 131/500 [======>.......................] - ETA: 2:05 - loss: 1.0570 - regression_loss: 0.9287 - classification_loss: 0.1283 132/500 [======>.......................] 
- ETA: 2:05 - loss: 1.0578 - regression_loss: 0.9297 - classification_loss: 0.1281 133/500 [======>.......................] - ETA: 2:05 - loss: 1.0659 - regression_loss: 0.9353 - classification_loss: 0.1307 134/500 [=======>......................] - ETA: 2:04 - loss: 1.0640 - regression_loss: 0.9336 - classification_loss: 0.1305 135/500 [=======>......................] - ETA: 2:04 - loss: 1.0603 - regression_loss: 0.9305 - classification_loss: 0.1299 136/500 [=======>......................] - ETA: 2:04 - loss: 1.0595 - regression_loss: 0.9296 - classification_loss: 0.1299 137/500 [=======>......................] - ETA: 2:03 - loss: 1.0623 - regression_loss: 0.9324 - classification_loss: 0.1299 138/500 [=======>......................] - ETA: 2:03 - loss: 1.0639 - regression_loss: 0.9332 - classification_loss: 0.1307 139/500 [=======>......................] - ETA: 2:03 - loss: 1.0645 - regression_loss: 0.9340 - classification_loss: 0.1306 140/500 [=======>......................] - ETA: 2:02 - loss: 1.0617 - regression_loss: 0.9315 - classification_loss: 0.1301 141/500 [=======>......................] - ETA: 2:02 - loss: 1.0614 - regression_loss: 0.9316 - classification_loss: 0.1298 142/500 [=======>......................] - ETA: 2:02 - loss: 1.0787 - regression_loss: 0.9429 - classification_loss: 0.1358 143/500 [=======>......................] - ETA: 2:01 - loss: 1.0847 - regression_loss: 0.9478 - classification_loss: 0.1369 144/500 [=======>......................] - ETA: 2:01 - loss: 1.0815 - regression_loss: 0.9450 - classification_loss: 0.1365 145/500 [=======>......................] - ETA: 2:01 - loss: 1.0768 - regression_loss: 0.9410 - classification_loss: 0.1358 146/500 [=======>......................] - ETA: 2:00 - loss: 1.0770 - regression_loss: 0.9416 - classification_loss: 0.1354 147/500 [=======>......................] - ETA: 2:00 - loss: 1.0753 - regression_loss: 0.9398 - classification_loss: 0.1355 148/500 [=======>......................] 
- ETA: 1:59 - loss: 1.0765 - regression_loss: 0.9411 - classification_loss: 0.1354 149/500 [=======>......................] - ETA: 1:59 - loss: 1.0767 - regression_loss: 0.9419 - classification_loss: 0.1348 150/500 [========>.....................] - ETA: 1:59 - loss: 1.0786 - regression_loss: 0.9430 - classification_loss: 0.1356 151/500 [========>.....................] - ETA: 1:58 - loss: 1.0775 - regression_loss: 0.9421 - classification_loss: 0.1354 152/500 [========>.....................] - ETA: 1:58 - loss: 1.0752 - regression_loss: 0.9400 - classification_loss: 0.1352 153/500 [========>.....................] - ETA: 1:58 - loss: 1.0833 - regression_loss: 0.9461 - classification_loss: 0.1372 154/500 [========>.....................] - ETA: 1:57 - loss: 1.0835 - regression_loss: 0.9462 - classification_loss: 0.1373 155/500 [========>.....................] - ETA: 1:57 - loss: 1.0839 - regression_loss: 0.9468 - classification_loss: 0.1370 156/500 [========>.....................] - ETA: 1:57 - loss: 1.0856 - regression_loss: 0.9483 - classification_loss: 0.1373 157/500 [========>.....................] - ETA: 1:57 - loss: 1.0926 - regression_loss: 0.9532 - classification_loss: 0.1394 158/500 [========>.....................] - ETA: 1:56 - loss: 1.0883 - regression_loss: 0.9494 - classification_loss: 0.1388 159/500 [========>.....................] - ETA: 1:56 - loss: 1.0879 - regression_loss: 0.9492 - classification_loss: 0.1387 160/500 [========>.....................] - ETA: 1:55 - loss: 1.0875 - regression_loss: 0.9489 - classification_loss: 0.1386 161/500 [========>.....................] - ETA: 1:55 - loss: 1.0837 - regression_loss: 0.9457 - classification_loss: 0.1381 162/500 [========>.....................] - ETA: 1:55 - loss: 1.0841 - regression_loss: 0.9461 - classification_loss: 0.1380 163/500 [========>.....................] - ETA: 1:54 - loss: 1.0850 - regression_loss: 0.9470 - classification_loss: 0.1381 164/500 [========>.....................] 
- ETA: 1:54 - loss: 1.0843 - regression_loss: 0.9464 - classification_loss: 0.1379 165/500 [========>.....................] - ETA: 1:54 - loss: 1.0800 - regression_loss: 0.9428 - classification_loss: 0.1372 166/500 [========>.....................] - ETA: 1:53 - loss: 1.0778 - regression_loss: 0.9410 - classification_loss: 0.1368 167/500 [=========>....................] - ETA: 1:53 - loss: 1.0768 - regression_loss: 0.9404 - classification_loss: 0.1364 168/500 [=========>....................] - ETA: 1:53 - loss: 1.0732 - regression_loss: 0.9374 - classification_loss: 0.1357 169/500 [=========>....................] - ETA: 1:53 - loss: 1.0770 - regression_loss: 0.9410 - classification_loss: 0.1360 170/500 [=========>....................] - ETA: 1:52 - loss: 1.0776 - regression_loss: 0.9417 - classification_loss: 0.1360 171/500 [=========>....................] - ETA: 1:52 - loss: 1.0767 - regression_loss: 0.9411 - classification_loss: 0.1356 172/500 [=========>....................] - ETA: 1:51 - loss: 1.0771 - regression_loss: 0.9415 - classification_loss: 0.1356 173/500 [=========>....................] - ETA: 1:51 - loss: 1.0812 - regression_loss: 0.9454 - classification_loss: 0.1358 174/500 [=========>....................] - ETA: 1:51 - loss: 1.0797 - regression_loss: 0.9440 - classification_loss: 0.1357 175/500 [=========>....................] - ETA: 1:50 - loss: 1.0780 - regression_loss: 0.9427 - classification_loss: 0.1354 176/500 [=========>....................] - ETA: 1:50 - loss: 1.0787 - regression_loss: 0.9436 - classification_loss: 0.1351 177/500 [=========>....................] - ETA: 1:50 - loss: 1.0791 - regression_loss: 0.9442 - classification_loss: 0.1349 178/500 [=========>....................] - ETA: 1:49 - loss: 1.0781 - regression_loss: 0.9437 - classification_loss: 0.1345 179/500 [=========>....................] - ETA: 1:49 - loss: 1.0764 - regression_loss: 0.9425 - classification_loss: 0.1339 180/500 [=========>....................] 
- ETA: 1:49 - loss: 1.0734 - regression_loss: 0.9400 - classification_loss: 0.1334 181/500 [=========>....................] - ETA: 1:48 - loss: 1.0729 - regression_loss: 0.9397 - classification_loss: 0.1332 182/500 [=========>....................] - ETA: 1:48 - loss: 1.0701 - regression_loss: 0.9372 - classification_loss: 0.1329 183/500 [=========>....................] - ETA: 1:48 - loss: 1.0687 - regression_loss: 0.9361 - classification_loss: 0.1326 184/500 [==========>...................] - ETA: 1:47 - loss: 1.0667 - regression_loss: 0.9345 - classification_loss: 0.1322 185/500 [==========>...................] - ETA: 1:47 - loss: 1.0658 - regression_loss: 0.9340 - classification_loss: 0.1318 186/500 [==========>...................] - ETA: 1:47 - loss: 1.0621 - regression_loss: 0.9308 - classification_loss: 0.1313 187/500 [==========>...................] - ETA: 1:46 - loss: 1.0608 - regression_loss: 0.9298 - classification_loss: 0.1310 188/500 [==========>...................] - ETA: 1:46 - loss: 1.0580 - regression_loss: 0.9274 - classification_loss: 0.1305 189/500 [==========>...................] - ETA: 1:46 - loss: 1.0548 - regression_loss: 0.9248 - classification_loss: 0.1300 190/500 [==========>...................] - ETA: 1:45 - loss: 1.0573 - regression_loss: 0.9260 - classification_loss: 0.1313 191/500 [==========>...................] - ETA: 1:45 - loss: 1.0589 - regression_loss: 0.9272 - classification_loss: 0.1318 192/500 [==========>...................] - ETA: 1:45 - loss: 1.0573 - regression_loss: 0.9260 - classification_loss: 0.1314 193/500 [==========>...................] - ETA: 1:44 - loss: 1.0576 - regression_loss: 0.9261 - classification_loss: 0.1316 194/500 [==========>...................] - ETA: 1:44 - loss: 1.0574 - regression_loss: 0.9263 - classification_loss: 0.1311 195/500 [==========>...................] - ETA: 1:44 - loss: 1.0560 - regression_loss: 0.9250 - classification_loss: 0.1310 196/500 [==========>...................] 
- ETA: 1:43 - loss: 1.0554 - regression_loss: 0.9244 - classification_loss: 0.1311 197/500 [==========>...................] - ETA: 1:43 - loss: 1.0555 - regression_loss: 0.9244 - classification_loss: 0.1311 198/500 [==========>...................] - ETA: 1:43 - loss: 1.0563 - regression_loss: 0.9253 - classification_loss: 0.1310 199/500 [==========>...................] - ETA: 1:42 - loss: 1.0559 - regression_loss: 0.9252 - classification_loss: 0.1308 200/500 [===========>..................] - ETA: 1:42 - loss: 1.0538 - regression_loss: 0.9233 - classification_loss: 0.1305 201/500 [===========>..................] - ETA: 1:41 - loss: 1.0594 - regression_loss: 0.9255 - classification_loss: 0.1339 202/500 [===========>..................] - ETA: 1:41 - loss: 1.0599 - regression_loss: 0.9258 - classification_loss: 0.1341 203/500 [===========>..................] - ETA: 1:41 - loss: 1.0586 - regression_loss: 0.9249 - classification_loss: 0.1338 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0591 - regression_loss: 0.9255 - classification_loss: 0.1336 205/500 [===========>..................] - ETA: 1:40 - loss: 1.0578 - regression_loss: 0.9245 - classification_loss: 0.1334 206/500 [===========>..................] - ETA: 1:40 - loss: 1.0575 - regression_loss: 0.9243 - classification_loss: 0.1332 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0558 - regression_loss: 0.9226 - classification_loss: 0.1332 208/500 [===========>..................] - ETA: 1:39 - loss: 1.0568 - regression_loss: 0.9234 - classification_loss: 0.1334 209/500 [===========>..................] - ETA: 1:39 - loss: 1.0563 - regression_loss: 0.9229 - classification_loss: 0.1334 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0577 - regression_loss: 0.9242 - classification_loss: 0.1335 211/500 [===========>..................] - ETA: 1:38 - loss: 1.0570 - regression_loss: 0.9237 - classification_loss: 0.1333 212/500 [===========>..................] 
- ETA: 1:38 - loss: 1.0607 - regression_loss: 0.9268 - classification_loss: 0.1339 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0625 - regression_loss: 0.9287 - classification_loss: 0.1338 214/500 [===========>..................] - ETA: 1:37 - loss: 1.0613 - regression_loss: 0.9278 - classification_loss: 0.1335 215/500 [===========>..................] - ETA: 1:37 - loss: 1.0593 - regression_loss: 0.9263 - classification_loss: 0.1331 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0588 - regression_loss: 0.9259 - classification_loss: 0.1328 217/500 [============>.................] - ETA: 1:36 - loss: 1.0576 - regression_loss: 0.9251 - classification_loss: 0.1326 218/500 [============>.................] - ETA: 1:36 - loss: 1.0575 - regression_loss: 0.9251 - classification_loss: 0.1323 219/500 [============>.................] - ETA: 1:35 - loss: 1.0587 - regression_loss: 0.9263 - classification_loss: 0.1324 220/500 [============>.................] - ETA: 1:35 - loss: 1.0617 - regression_loss: 0.9290 - classification_loss: 0.1327 221/500 [============>.................] - ETA: 1:35 - loss: 1.0603 - regression_loss: 0.9278 - classification_loss: 0.1326 222/500 [============>.................] - ETA: 1:34 - loss: 1.0570 - regression_loss: 0.9248 - classification_loss: 0.1322 223/500 [============>.................] - ETA: 1:34 - loss: 1.0567 - regression_loss: 0.9243 - classification_loss: 0.1323 224/500 [============>.................] - ETA: 1:34 - loss: 1.0568 - regression_loss: 0.9245 - classification_loss: 0.1323 225/500 [============>.................] - ETA: 1:33 - loss: 1.0620 - regression_loss: 0.9285 - classification_loss: 0.1335 226/500 [============>.................] - ETA: 1:33 - loss: 1.0607 - regression_loss: 0.9274 - classification_loss: 0.1333 227/500 [============>.................] - ETA: 1:33 - loss: 1.0624 - regression_loss: 0.9290 - classification_loss: 0.1334 228/500 [============>.................] 
- ETA: 1:32 - loss: 1.0656 - regression_loss: 0.9313 - classification_loss: 0.1343 229/500 [============>.................] - ETA: 1:32 - loss: 1.0652 - regression_loss: 0.9308 - classification_loss: 0.1345 230/500 [============>.................] - ETA: 1:32 - loss: 1.0615 - regression_loss: 0.9275 - classification_loss: 0.1340 231/500 [============>.................] - ETA: 1:31 - loss: 1.0615 - regression_loss: 0.9274 - classification_loss: 0.1341 232/500 [============>.................] - ETA: 1:31 - loss: 1.0586 - regression_loss: 0.9250 - classification_loss: 0.1336 233/500 [============>.................] - ETA: 1:31 - loss: 1.0570 - regression_loss: 0.9236 - classification_loss: 0.1334 234/500 [=============>................] - ETA: 1:30 - loss: 1.0590 - regression_loss: 0.9253 - classification_loss: 0.1337 235/500 [=============>................] - ETA: 1:30 - loss: 1.0561 - regression_loss: 0.9229 - classification_loss: 0.1332 236/500 [=============>................] - ETA: 1:30 - loss: 1.0557 - regression_loss: 0.9221 - classification_loss: 0.1335 237/500 [=============>................] - ETA: 1:29 - loss: 1.0624 - regression_loss: 0.9280 - classification_loss: 0.1344 238/500 [=============>................] - ETA: 1:29 - loss: 1.0618 - regression_loss: 0.9275 - classification_loss: 0.1343 239/500 [=============>................] - ETA: 1:28 - loss: 1.0603 - regression_loss: 0.9262 - classification_loss: 0.1341 240/500 [=============>................] - ETA: 1:28 - loss: 1.0566 - regression_loss: 0.9223 - classification_loss: 0.1343 241/500 [=============>................] - ETA: 1:28 - loss: 1.0581 - regression_loss: 0.9236 - classification_loss: 0.1346 242/500 [=============>................] - ETA: 1:27 - loss: 1.0557 - regression_loss: 0.9216 - classification_loss: 0.1342 243/500 [=============>................] - ETA: 1:27 - loss: 1.0554 - regression_loss: 0.9214 - classification_loss: 0.1340 244/500 [=============>................] 
- ETA: 1:27 - loss: 1.0541 - regression_loss: 0.9202 - classification_loss: 0.1339 245/500 [=============>................] - ETA: 1:26 - loss: 1.0513 - regression_loss: 0.9179 - classification_loss: 0.1334 246/500 [=============>................] - ETA: 1:26 - loss: 1.0501 - regression_loss: 0.9169 - classification_loss: 0.1332 247/500 [=============>................] - ETA: 1:26 - loss: 1.0548 - regression_loss: 0.9206 - classification_loss: 0.1342 248/500 [=============>................] - ETA: 1:25 - loss: 1.0529 - regression_loss: 0.9191 - classification_loss: 0.1338 249/500 [=============>................] - ETA: 1:25 - loss: 1.0523 - regression_loss: 0.9185 - classification_loss: 0.1338 250/500 [==============>...............] - ETA: 1:25 - loss: 1.0530 - regression_loss: 0.9192 - classification_loss: 0.1339 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0520 - regression_loss: 0.9184 - classification_loss: 0.1336 252/500 [==============>...............] - ETA: 1:24 - loss: 1.0535 - regression_loss: 0.9195 - classification_loss: 0.1340 253/500 [==============>...............] - ETA: 1:24 - loss: 1.0530 - regression_loss: 0.9190 - classification_loss: 0.1340 254/500 [==============>...............] - ETA: 1:23 - loss: 1.0524 - regression_loss: 0.9184 - classification_loss: 0.1340 255/500 [==============>...............] - ETA: 1:23 - loss: 1.0529 - regression_loss: 0.9190 - classification_loss: 0.1338 256/500 [==============>...............] - ETA: 1:23 - loss: 1.0545 - regression_loss: 0.9208 - classification_loss: 0.1337 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0526 - regression_loss: 0.9192 - classification_loss: 0.1334 258/500 [==============>...............] - ETA: 1:22 - loss: 1.0511 - regression_loss: 0.9180 - classification_loss: 0.1331 259/500 [==============>...............] - ETA: 1:22 - loss: 1.0533 - regression_loss: 0.9200 - classification_loss: 0.1333 260/500 [==============>...............] 
- ETA: 1:21 - loss: 1.0556 - regression_loss: 0.9212 - classification_loss: 0.1344 261/500 [==============>...............] - ETA: 1:21 - loss: 1.0556 - regression_loss: 0.9214 - classification_loss: 0.1342 262/500 [==============>...............] - ETA: 1:21 - loss: 1.0543 - regression_loss: 0.9204 - classification_loss: 0.1340 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0547 - regression_loss: 0.9207 - classification_loss: 0.1340 264/500 [==============>...............] - ETA: 1:20 - loss: 1.0559 - regression_loss: 0.9209 - classification_loss: 0.1350 265/500 [==============>...............] - ETA: 1:20 - loss: 1.0548 - regression_loss: 0.9200 - classification_loss: 0.1348 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0541 - regression_loss: 0.9195 - classification_loss: 0.1346 267/500 [===============>..............] - ETA: 1:19 - loss: 1.0545 - regression_loss: 0.9197 - classification_loss: 0.1348 268/500 [===============>..............] - ETA: 1:18 - loss: 1.0549 - regression_loss: 0.9202 - classification_loss: 0.1347 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0562 - regression_loss: 0.9214 - classification_loss: 0.1348 270/500 [===============>..............] - ETA: 1:18 - loss: 1.0561 - regression_loss: 0.9213 - classification_loss: 0.1348 271/500 [===============>..............] - ETA: 1:17 - loss: 1.0576 - regression_loss: 0.9225 - classification_loss: 0.1352 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0562 - regression_loss: 0.9212 - classification_loss: 0.1350 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0575 - regression_loss: 0.9228 - classification_loss: 0.1348 274/500 [===============>..............] - ETA: 1:16 - loss: 1.0567 - regression_loss: 0.9222 - classification_loss: 0.1345 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0580 - regression_loss: 0.9234 - classification_loss: 0.1345 276/500 [===============>..............] 
- ETA: 1:16 - loss: 1.0590 - regression_loss: 0.9245 - classification_loss: 0.1346 277/500 [===============>..............] - ETA: 1:15 - loss: 1.0574 - regression_loss: 0.9232 - classification_loss: 0.1342 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0573 - regression_loss: 0.9231 - classification_loss: 0.1342 279/500 [===============>..............] - ETA: 1:15 - loss: 1.0574 - regression_loss: 0.9234 - classification_loss: 0.1340 280/500 [===============>..............] - ETA: 1:14 - loss: 1.0583 - regression_loss: 0.9242 - classification_loss: 0.1341 281/500 [===============>..............] - ETA: 1:14 - loss: 1.0618 - regression_loss: 0.9268 - classification_loss: 0.1350 282/500 [===============>..............] - ETA: 1:14 - loss: 1.0607 - regression_loss: 0.9260 - classification_loss: 0.1347 283/500 [===============>..............] - ETA: 1:13 - loss: 1.0625 - regression_loss: 0.9275 - classification_loss: 0.1349 284/500 [================>.............] - ETA: 1:13 - loss: 1.0603 - regression_loss: 0.9257 - classification_loss: 0.1346 285/500 [================>.............] - ETA: 1:13 - loss: 1.0624 - regression_loss: 0.9274 - classification_loss: 0.1350 286/500 [================>.............] - ETA: 1:12 - loss: 1.0618 - regression_loss: 0.9268 - classification_loss: 0.1350 287/500 [================>.............] - ETA: 1:12 - loss: 1.0598 - regression_loss: 0.9251 - classification_loss: 0.1347 288/500 [================>.............] - ETA: 1:12 - loss: 1.0591 - regression_loss: 0.9246 - classification_loss: 0.1345 289/500 [================>.............] - ETA: 1:11 - loss: 1.0580 - regression_loss: 0.9238 - classification_loss: 0.1341 290/500 [================>.............] - ETA: 1:11 - loss: 1.0581 - regression_loss: 0.9242 - classification_loss: 0.1340 291/500 [================>.............] - ETA: 1:11 - loss: 1.0595 - regression_loss: 0.9255 - classification_loss: 0.1341 292/500 [================>.............] 
- ETA: 1:10 - loss: 1.0602 - regression_loss: 0.9263 - classification_loss: 0.1340
293/500 [================>.............] - ETA: 1:10 - loss: 1.0595 - regression_loss: 0.9256 - classification_loss: 0.1338
... [per-step progress lines for steps 294-498 elided; loss fluctuated between ~1.06 and ~1.08] ...
499/500 [============================>.] - ETA: 0s - loss: 1.0642 - regression_loss: 0.9349 - classification_loss: 0.1293
500/500 [==============================]
- ETA: 0s - loss: 1.0643 - regression_loss: 0.9351 - classification_loss: 0.1292
500/500 [==============================] - 170s 340ms/step - loss: 1.0653 - regression_loss: 0.9361 - classification_loss: 0.1292
326 instances of class plum with average precision: 0.8173
mAP: 0.8173
Epoch 00022: saving model to ./training/snapshots/resnet101_pascal_22.h5
Epoch 23/150
1/500 [..............................] - ETA: 2:47 - loss: 0.3101 - regression_loss: 0.2832 - classification_loss: 0.0269
... [per-step progress lines for steps 2-12 elided] ...
13/500 [..............................] - ETA: 2:47 - loss: 0.9571 - regression_loss: 0.8415 - classification_loss: 0.1156
14/500 [..............................]
- ETA: 2:46 - loss: 0.9931 - regression_loss: 0.8691 - classification_loss: 0.1241
... [per-step progress lines for steps 15-124 elided; loss settled near ~1.03-1.05] ...
125/500 [======>.......................] - ETA: 2:07 - loss: 1.0316 - regression_loss: 0.9133 - classification_loss: 0.1183
126/500 [======>.......................]
- ETA: 2:07 - loss: 1.0377 - regression_loss: 0.9186 - classification_loss: 0.1191 127/500 [======>.......................] - ETA: 2:07 - loss: 1.0412 - regression_loss: 0.9213 - classification_loss: 0.1199 128/500 [======>.......................] - ETA: 2:06 - loss: 1.0424 - regression_loss: 0.9228 - classification_loss: 0.1197 129/500 [======>.......................] - ETA: 2:06 - loss: 1.0429 - regression_loss: 0.9236 - classification_loss: 0.1193 130/500 [======>.......................] - ETA: 2:06 - loss: 1.0427 - regression_loss: 0.9232 - classification_loss: 0.1194 131/500 [======>.......................] - ETA: 2:05 - loss: 1.0431 - regression_loss: 0.9239 - classification_loss: 0.1192 132/500 [======>.......................] - ETA: 2:05 - loss: 1.0408 - regression_loss: 0.9221 - classification_loss: 0.1188 133/500 [======>.......................] - ETA: 2:05 - loss: 1.0382 - regression_loss: 0.9199 - classification_loss: 0.1183 134/500 [=======>......................] - ETA: 2:04 - loss: 1.0407 - regression_loss: 0.9226 - classification_loss: 0.1181 135/500 [=======>......................] - ETA: 2:04 - loss: 1.0369 - regression_loss: 0.9192 - classification_loss: 0.1176 136/500 [=======>......................] - ETA: 2:04 - loss: 1.0354 - regression_loss: 0.9180 - classification_loss: 0.1174 137/500 [=======>......................] - ETA: 2:03 - loss: 1.0354 - regression_loss: 0.9174 - classification_loss: 0.1180 138/500 [=======>......................] - ETA: 2:03 - loss: 1.0339 - regression_loss: 0.9162 - classification_loss: 0.1178 139/500 [=======>......................] - ETA: 2:03 - loss: 1.0298 - regression_loss: 0.9128 - classification_loss: 0.1170 140/500 [=======>......................] - ETA: 2:02 - loss: 1.0274 - regression_loss: 0.9104 - classification_loss: 0.1170 141/500 [=======>......................] - ETA: 2:02 - loss: 1.0303 - regression_loss: 0.9129 - classification_loss: 0.1174 142/500 [=======>......................] 
- ETA: 2:02 - loss: 1.0359 - regression_loss: 0.9162 - classification_loss: 0.1197 143/500 [=======>......................] - ETA: 2:01 - loss: 1.0435 - regression_loss: 0.9214 - classification_loss: 0.1220 144/500 [=======>......................] - ETA: 2:01 - loss: 1.0426 - regression_loss: 0.9210 - classification_loss: 0.1216 145/500 [=======>......................] - ETA: 2:01 - loss: 1.0449 - regression_loss: 0.9230 - classification_loss: 0.1219 146/500 [=======>......................] - ETA: 2:00 - loss: 1.0457 - regression_loss: 0.9224 - classification_loss: 0.1232 147/500 [=======>......................] - ETA: 2:00 - loss: 1.0430 - regression_loss: 0.9203 - classification_loss: 0.1228 148/500 [=======>......................] - ETA: 2:00 - loss: 1.0418 - regression_loss: 0.9194 - classification_loss: 0.1225 149/500 [=======>......................] - ETA: 1:59 - loss: 1.0401 - regression_loss: 0.9181 - classification_loss: 0.1220 150/500 [========>.....................] - ETA: 1:59 - loss: 1.0381 - regression_loss: 0.9164 - classification_loss: 0.1217 151/500 [========>.....................] - ETA: 1:59 - loss: 1.0366 - regression_loss: 0.9152 - classification_loss: 0.1214 152/500 [========>.....................] - ETA: 1:58 - loss: 1.0556 - regression_loss: 0.9304 - classification_loss: 0.1252 153/500 [========>.....................] - ETA: 1:58 - loss: 1.0612 - regression_loss: 0.9346 - classification_loss: 0.1266 154/500 [========>.....................] - ETA: 1:58 - loss: 1.0620 - regression_loss: 0.9357 - classification_loss: 0.1264 155/500 [========>.....................] - ETA: 1:57 - loss: 1.0628 - regression_loss: 0.9365 - classification_loss: 0.1263 156/500 [========>.....................] - ETA: 1:57 - loss: 1.0669 - regression_loss: 0.9400 - classification_loss: 0.1269 157/500 [========>.....................] - ETA: 1:57 - loss: 1.0661 - regression_loss: 0.9394 - classification_loss: 0.1267 158/500 [========>.....................] 
- ETA: 1:56 - loss: 1.0638 - regression_loss: 0.9375 - classification_loss: 0.1263 159/500 [========>.....................] - ETA: 1:56 - loss: 1.0633 - regression_loss: 0.9367 - classification_loss: 0.1266 160/500 [========>.....................] - ETA: 1:56 - loss: 1.0610 - regression_loss: 0.9348 - classification_loss: 0.1262 161/500 [========>.....................] - ETA: 1:55 - loss: 1.0607 - regression_loss: 0.9348 - classification_loss: 0.1259 162/500 [========>.....................] - ETA: 1:55 - loss: 1.0617 - regression_loss: 0.9357 - classification_loss: 0.1260 163/500 [========>.....................] - ETA: 1:54 - loss: 1.0618 - regression_loss: 0.9358 - classification_loss: 0.1260 164/500 [========>.....................] - ETA: 1:54 - loss: 1.0635 - regression_loss: 0.9375 - classification_loss: 0.1260 165/500 [========>.....................] - ETA: 1:54 - loss: 1.0638 - regression_loss: 0.9378 - classification_loss: 0.1261 166/500 [========>.....................] - ETA: 1:54 - loss: 1.0648 - regression_loss: 0.9390 - classification_loss: 0.1258 167/500 [=========>....................] - ETA: 1:53 - loss: 1.0675 - regression_loss: 0.9413 - classification_loss: 0.1262 168/500 [=========>....................] - ETA: 1:53 - loss: 1.0734 - regression_loss: 0.9467 - classification_loss: 0.1268 169/500 [=========>....................] - ETA: 1:53 - loss: 1.0739 - regression_loss: 0.9471 - classification_loss: 0.1268 170/500 [=========>....................] - ETA: 1:52 - loss: 1.0735 - regression_loss: 0.9469 - classification_loss: 0.1265 171/500 [=========>....................] - ETA: 1:52 - loss: 1.0717 - regression_loss: 0.9454 - classification_loss: 0.1263 172/500 [=========>....................] - ETA: 1:51 - loss: 1.0692 - regression_loss: 0.9433 - classification_loss: 0.1259 173/500 [=========>....................] - ETA: 1:51 - loss: 1.0682 - regression_loss: 0.9425 - classification_loss: 0.1256 174/500 [=========>....................] 
- ETA: 1:51 - loss: 1.0694 - regression_loss: 0.9432 - classification_loss: 0.1262 175/500 [=========>....................] - ETA: 1:51 - loss: 1.0763 - regression_loss: 0.9494 - classification_loss: 0.1268 176/500 [=========>....................] - ETA: 1:50 - loss: 1.0751 - regression_loss: 0.9484 - classification_loss: 0.1267 177/500 [=========>....................] - ETA: 1:50 - loss: 1.0708 - regression_loss: 0.9447 - classification_loss: 0.1261 178/500 [=========>....................] - ETA: 1:50 - loss: 1.0691 - regression_loss: 0.9432 - classification_loss: 0.1259 179/500 [=========>....................] - ETA: 1:49 - loss: 1.0683 - regression_loss: 0.9428 - classification_loss: 0.1255 180/500 [=========>....................] - ETA: 1:49 - loss: 1.0726 - regression_loss: 0.9459 - classification_loss: 0.1267 181/500 [=========>....................] - ETA: 1:49 - loss: 1.0720 - regression_loss: 0.9455 - classification_loss: 0.1265 182/500 [=========>....................] - ETA: 1:48 - loss: 1.0678 - regression_loss: 0.9418 - classification_loss: 0.1261 183/500 [=========>....................] - ETA: 1:48 - loss: 1.0648 - regression_loss: 0.9392 - classification_loss: 0.1256 184/500 [==========>...................] - ETA: 1:47 - loss: 1.0637 - regression_loss: 0.9382 - classification_loss: 0.1255 185/500 [==========>...................] - ETA: 1:47 - loss: 1.0608 - regression_loss: 0.9358 - classification_loss: 0.1250 186/500 [==========>...................] - ETA: 1:47 - loss: 1.0618 - regression_loss: 0.9369 - classification_loss: 0.1249 187/500 [==========>...................] - ETA: 1:46 - loss: 1.0586 - regression_loss: 0.9343 - classification_loss: 0.1244 188/500 [==========>...................] - ETA: 1:46 - loss: 1.0558 - regression_loss: 0.9318 - classification_loss: 0.1239 189/500 [==========>...................] - ETA: 1:46 - loss: 1.0553 - regression_loss: 0.9316 - classification_loss: 0.1237 190/500 [==========>...................] 
- ETA: 1:45 - loss: 1.0547 - regression_loss: 0.9306 - classification_loss: 0.1241 191/500 [==========>...................] - ETA: 1:45 - loss: 1.0547 - regression_loss: 0.9306 - classification_loss: 0.1241 192/500 [==========>...................] - ETA: 1:45 - loss: 1.0542 - regression_loss: 0.9302 - classification_loss: 0.1240 193/500 [==========>...................] - ETA: 1:44 - loss: 1.0555 - regression_loss: 0.9314 - classification_loss: 0.1241 194/500 [==========>...................] - ETA: 1:44 - loss: 1.0547 - regression_loss: 0.9311 - classification_loss: 0.1236 195/500 [==========>...................] - ETA: 1:44 - loss: 1.0565 - regression_loss: 0.9327 - classification_loss: 0.1238 196/500 [==========>...................] - ETA: 1:43 - loss: 1.0566 - regression_loss: 0.9329 - classification_loss: 0.1237 197/500 [==========>...................] - ETA: 1:43 - loss: 1.0558 - regression_loss: 0.9323 - classification_loss: 0.1235 198/500 [==========>...................] - ETA: 1:43 - loss: 1.0541 - regression_loss: 0.9312 - classification_loss: 0.1229 199/500 [==========>...................] - ETA: 1:42 - loss: 1.0543 - regression_loss: 0.9315 - classification_loss: 0.1229 200/500 [===========>..................] - ETA: 1:42 - loss: 1.0552 - regression_loss: 0.9323 - classification_loss: 0.1230 201/500 [===========>..................] - ETA: 1:42 - loss: 1.0557 - regression_loss: 0.9327 - classification_loss: 0.1230 202/500 [===========>..................] - ETA: 1:41 - loss: 1.0557 - regression_loss: 0.9329 - classification_loss: 0.1228 203/500 [===========>..................] - ETA: 1:41 - loss: 1.0584 - regression_loss: 0.9350 - classification_loss: 0.1233 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0556 - regression_loss: 0.9326 - classification_loss: 0.1229 205/500 [===========>..................] - ETA: 1:40 - loss: 1.0565 - regression_loss: 0.9335 - classification_loss: 0.1230 206/500 [===========>..................] 
- ETA: 1:40 - loss: 1.0537 - regression_loss: 0.9308 - classification_loss: 0.1229 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0508 - regression_loss: 0.9282 - classification_loss: 0.1226 208/500 [===========>..................] - ETA: 1:39 - loss: 1.0483 - regression_loss: 0.9261 - classification_loss: 0.1222 209/500 [===========>..................] - ETA: 1:39 - loss: 1.0477 - regression_loss: 0.9257 - classification_loss: 0.1220 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0484 - regression_loss: 0.9264 - classification_loss: 0.1220 211/500 [===========>..................] - ETA: 1:38 - loss: 1.0498 - regression_loss: 0.9278 - classification_loss: 0.1220 212/500 [===========>..................] - ETA: 1:38 - loss: 1.0484 - regression_loss: 0.9261 - classification_loss: 0.1223 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0462 - regression_loss: 0.9243 - classification_loss: 0.1219 214/500 [===========>..................] - ETA: 1:37 - loss: 1.0444 - regression_loss: 0.9227 - classification_loss: 0.1217 215/500 [===========>..................] - ETA: 1:37 - loss: 1.0447 - regression_loss: 0.9231 - classification_loss: 0.1216 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0434 - regression_loss: 0.9222 - classification_loss: 0.1212 217/500 [============>.................] - ETA: 1:36 - loss: 1.0426 - regression_loss: 0.9216 - classification_loss: 0.1209 218/500 [============>.................] - ETA: 1:36 - loss: 1.0427 - regression_loss: 0.9216 - classification_loss: 0.1211 219/500 [============>.................] - ETA: 1:35 - loss: 1.0422 - regression_loss: 0.9213 - classification_loss: 0.1209 220/500 [============>.................] - ETA: 1:35 - loss: 1.0395 - regression_loss: 0.9190 - classification_loss: 0.1205 221/500 [============>.................] - ETA: 1:35 - loss: 1.0396 - regression_loss: 0.9190 - classification_loss: 0.1207 222/500 [============>.................] 
- ETA: 1:34 - loss: 1.0387 - regression_loss: 0.9181 - classification_loss: 0.1206 223/500 [============>.................] - ETA: 1:34 - loss: 1.0357 - regression_loss: 0.9155 - classification_loss: 0.1202 224/500 [============>.................] - ETA: 1:34 - loss: 1.0352 - regression_loss: 0.9148 - classification_loss: 0.1204 225/500 [============>.................] - ETA: 1:33 - loss: 1.0350 - regression_loss: 0.9146 - classification_loss: 0.1204 226/500 [============>.................] - ETA: 1:33 - loss: 1.0346 - regression_loss: 0.9143 - classification_loss: 0.1203 227/500 [============>.................] - ETA: 1:33 - loss: 1.0361 - regression_loss: 0.9153 - classification_loss: 0.1208 228/500 [============>.................] - ETA: 1:32 - loss: 1.0354 - regression_loss: 0.9149 - classification_loss: 0.1205 229/500 [============>.................] - ETA: 1:32 - loss: 1.0360 - regression_loss: 0.9156 - classification_loss: 0.1204 230/500 [============>.................] - ETA: 1:32 - loss: 1.0336 - regression_loss: 0.9135 - classification_loss: 0.1201 231/500 [============>.................] - ETA: 1:31 - loss: 1.0321 - regression_loss: 0.9124 - classification_loss: 0.1197 232/500 [============>.................] - ETA: 1:31 - loss: 1.0312 - regression_loss: 0.9116 - classification_loss: 0.1195 233/500 [============>.................] - ETA: 1:31 - loss: 1.0323 - regression_loss: 0.9124 - classification_loss: 0.1199 234/500 [=============>................] - ETA: 1:30 - loss: 1.0341 - regression_loss: 0.9141 - classification_loss: 0.1199 235/500 [=============>................] - ETA: 1:30 - loss: 1.0329 - regression_loss: 0.9131 - classification_loss: 0.1198 236/500 [=============>................] - ETA: 1:30 - loss: 1.0317 - regression_loss: 0.9119 - classification_loss: 0.1198 237/500 [=============>................] - ETA: 1:29 - loss: 1.0334 - regression_loss: 0.9134 - classification_loss: 0.1200 238/500 [=============>................] 
- ETA: 1:29 - loss: 1.0335 - regression_loss: 0.9136 - classification_loss: 0.1200 239/500 [=============>................] - ETA: 1:29 - loss: 1.0325 - regression_loss: 0.9127 - classification_loss: 0.1198 240/500 [=============>................] - ETA: 1:28 - loss: 1.0350 - regression_loss: 0.9149 - classification_loss: 0.1200 241/500 [=============>................] - ETA: 1:28 - loss: 1.0351 - regression_loss: 0.9150 - classification_loss: 0.1201 242/500 [=============>................] - ETA: 1:27 - loss: 1.0329 - regression_loss: 0.9126 - classification_loss: 0.1203 243/500 [=============>................] - ETA: 1:27 - loss: 1.0298 - regression_loss: 0.9099 - classification_loss: 0.1199 244/500 [=============>................] - ETA: 1:27 - loss: 1.0269 - regression_loss: 0.9075 - classification_loss: 0.1195 245/500 [=============>................] - ETA: 1:26 - loss: 1.0301 - regression_loss: 0.9097 - classification_loss: 0.1204 246/500 [=============>................] - ETA: 1:26 - loss: 1.0322 - regression_loss: 0.9118 - classification_loss: 0.1204 247/500 [=============>................] - ETA: 1:26 - loss: 1.0314 - regression_loss: 0.9112 - classification_loss: 0.1201 248/500 [=============>................] - ETA: 1:25 - loss: 1.0315 - regression_loss: 0.9115 - classification_loss: 0.1200 249/500 [=============>................] - ETA: 1:25 - loss: 1.0299 - regression_loss: 0.9103 - classification_loss: 0.1197 250/500 [==============>...............] - ETA: 1:25 - loss: 1.0291 - regression_loss: 0.9096 - classification_loss: 0.1195 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0295 - regression_loss: 0.9100 - classification_loss: 0.1195 252/500 [==============>...............] - ETA: 1:24 - loss: 1.0301 - regression_loss: 0.9108 - classification_loss: 0.1194 253/500 [==============>...............] - ETA: 1:24 - loss: 1.0314 - regression_loss: 0.9119 - classification_loss: 0.1195 254/500 [==============>...............] 
- ETA: 1:23 - loss: 1.0305 - regression_loss: 0.9111 - classification_loss: 0.1193 255/500 [==============>...............] - ETA: 1:23 - loss: 1.0291 - regression_loss: 0.9101 - classification_loss: 0.1190 256/500 [==============>...............] - ETA: 1:23 - loss: 1.0278 - regression_loss: 0.9090 - classification_loss: 0.1188 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0277 - regression_loss: 0.9090 - classification_loss: 0.1187 258/500 [==============>...............] - ETA: 1:22 - loss: 1.0281 - regression_loss: 0.9093 - classification_loss: 0.1188 259/500 [==============>...............] - ETA: 1:22 - loss: 1.0291 - regression_loss: 0.9105 - classification_loss: 0.1185 260/500 [==============>...............] - ETA: 1:21 - loss: 1.0309 - regression_loss: 0.9119 - classification_loss: 0.1191 261/500 [==============>...............] - ETA: 1:21 - loss: 1.0296 - regression_loss: 0.9106 - classification_loss: 0.1190 262/500 [==============>...............] - ETA: 1:21 - loss: 1.0300 - regression_loss: 0.9110 - classification_loss: 0.1190 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0289 - regression_loss: 0.9102 - classification_loss: 0.1187 264/500 [==============>...............] - ETA: 1:20 - loss: 1.0287 - regression_loss: 0.9101 - classification_loss: 0.1186 265/500 [==============>...............] - ETA: 1:20 - loss: 1.0312 - regression_loss: 0.9127 - classification_loss: 0.1184 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0311 - regression_loss: 0.9127 - classification_loss: 0.1184 267/500 [===============>..............] - ETA: 1:19 - loss: 1.0285 - regression_loss: 0.9103 - classification_loss: 0.1182 268/500 [===============>..............] - ETA: 1:19 - loss: 1.0296 - regression_loss: 0.9114 - classification_loss: 0.1182 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0302 - regression_loss: 0.9119 - classification_loss: 0.1184 270/500 [===============>..............] 
- ETA: 1:18 - loss: 1.0301 - regression_loss: 0.9117 - classification_loss: 0.1184 271/500 [===============>..............] - ETA: 1:18 - loss: 1.0281 - regression_loss: 0.9100 - classification_loss: 0.1181 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0314 - regression_loss: 0.9131 - classification_loss: 0.1183 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0314 - regression_loss: 0.9133 - classification_loss: 0.1181 274/500 [===============>..............] - ETA: 1:16 - loss: 1.0317 - regression_loss: 0.9138 - classification_loss: 0.1178 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0302 - regression_loss: 0.9125 - classification_loss: 0.1176 276/500 [===============>..............] - ETA: 1:16 - loss: 1.0310 - regression_loss: 0.9131 - classification_loss: 0.1180 277/500 [===============>..............] - ETA: 1:15 - loss: 1.0299 - regression_loss: 0.9120 - classification_loss: 0.1179 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0290 - regression_loss: 0.9112 - classification_loss: 0.1178 279/500 [===============>..............] - ETA: 1:15 - loss: 1.0298 - regression_loss: 0.9117 - classification_loss: 0.1181 280/500 [===============>..............] - ETA: 1:14 - loss: 1.0290 - regression_loss: 0.9112 - classification_loss: 0.1178 281/500 [===============>..............] - ETA: 1:14 - loss: 1.0291 - regression_loss: 0.9113 - classification_loss: 0.1177 282/500 [===============>..............] - ETA: 1:14 - loss: 1.0289 - regression_loss: 0.9113 - classification_loss: 0.1177 283/500 [===============>..............] - ETA: 1:13 - loss: 1.0265 - regression_loss: 0.9091 - classification_loss: 0.1174 284/500 [================>.............] - ETA: 1:13 - loss: 1.0252 - regression_loss: 0.9081 - classification_loss: 0.1171 285/500 [================>.............] - ETA: 1:13 - loss: 1.0243 - regression_loss: 0.9074 - classification_loss: 0.1169 286/500 [================>.............] 
- ETA: 1:12 - loss: 1.0239 - regression_loss: 0.9070 - classification_loss: 0.1169 287/500 [================>.............] - ETA: 1:12 - loss: 1.0248 - regression_loss: 0.9079 - classification_loss: 0.1169 288/500 [================>.............] - ETA: 1:12 - loss: 1.0231 - regression_loss: 0.9064 - classification_loss: 0.1166 289/500 [================>.............] - ETA: 1:11 - loss: 1.0228 - regression_loss: 0.9061 - classification_loss: 0.1167 290/500 [================>.............] - ETA: 1:11 - loss: 1.0230 - regression_loss: 0.9062 - classification_loss: 0.1168 291/500 [================>.............] - ETA: 1:11 - loss: 1.0214 - regression_loss: 0.9049 - classification_loss: 0.1165 292/500 [================>.............] - ETA: 1:10 - loss: 1.0230 - regression_loss: 0.9061 - classification_loss: 0.1169 293/500 [================>.............] - ETA: 1:10 - loss: 1.0223 - regression_loss: 0.9056 - classification_loss: 0.1167 294/500 [================>.............] - ETA: 1:10 - loss: 1.0224 - regression_loss: 0.9058 - classification_loss: 0.1166 295/500 [================>.............] - ETA: 1:09 - loss: 1.0205 - regression_loss: 0.9040 - classification_loss: 0.1164 296/500 [================>.............] - ETA: 1:09 - loss: 1.0197 - regression_loss: 0.9033 - classification_loss: 0.1164 297/500 [================>.............] - ETA: 1:09 - loss: 1.0208 - regression_loss: 0.9044 - classification_loss: 0.1164 298/500 [================>.............] - ETA: 1:08 - loss: 1.0202 - regression_loss: 0.9038 - classification_loss: 0.1164 299/500 [================>.............] - ETA: 1:08 - loss: 1.0202 - regression_loss: 0.9039 - classification_loss: 0.1163 300/500 [=================>............] - ETA: 1:08 - loss: 1.0227 - regression_loss: 0.9058 - classification_loss: 0.1169 301/500 [=================>............] - ETA: 1:07 - loss: 1.0227 - regression_loss: 0.9057 - classification_loss: 0.1170 302/500 [=================>............] 
- ETA: 1:07 - loss: 1.0205 - regression_loss: 0.9039 - classification_loss: 0.1166 303/500 [=================>............] - ETA: 1:07 - loss: 1.0202 - regression_loss: 0.9038 - classification_loss: 0.1164 304/500 [=================>............] - ETA: 1:06 - loss: 1.0211 - regression_loss: 0.9047 - classification_loss: 0.1164 305/500 [=================>............] - ETA: 1:06 - loss: 1.0198 - regression_loss: 0.9037 - classification_loss: 0.1161 306/500 [=================>............] - ETA: 1:05 - loss: 1.0215 - regression_loss: 0.9048 - classification_loss: 0.1167 307/500 [=================>............] - ETA: 1:05 - loss: 1.0211 - regression_loss: 0.9044 - classification_loss: 0.1166 308/500 [=================>............] - ETA: 1:05 - loss: 1.0217 - regression_loss: 0.9046 - classification_loss: 0.1171 309/500 [=================>............] - ETA: 1:04 - loss: 1.0229 - regression_loss: 0.9054 - classification_loss: 0.1175 310/500 [=================>............] - ETA: 1:04 - loss: 1.0235 - regression_loss: 0.9062 - classification_loss: 0.1173 311/500 [=================>............] - ETA: 1:04 - loss: 1.0228 - regression_loss: 0.9055 - classification_loss: 0.1173 312/500 [=================>............] - ETA: 1:03 - loss: 1.0233 - regression_loss: 0.9061 - classification_loss: 0.1173 313/500 [=================>............] - ETA: 1:03 - loss: 1.0234 - regression_loss: 0.9061 - classification_loss: 0.1173 314/500 [=================>............] - ETA: 1:03 - loss: 1.0223 - regression_loss: 0.9052 - classification_loss: 0.1171 315/500 [=================>............] - ETA: 1:02 - loss: 1.0206 - regression_loss: 0.9038 - classification_loss: 0.1168 316/500 [=================>............] - ETA: 1:02 - loss: 1.0200 - regression_loss: 0.9027 - classification_loss: 0.1173 317/500 [==================>...........] - ETA: 1:02 - loss: 1.0200 - regression_loss: 0.9028 - classification_loss: 0.1171 318/500 [==================>...........] 
- ETA: 1:01 - loss: 1.0199 - regression_loss: 0.9028 - classification_loss: 0.1170 319/500 [==================>...........] - ETA: 1:01 - loss: 1.0223 - regression_loss: 0.9049 - classification_loss: 0.1175 320/500 [==================>...........] - ETA: 1:01 - loss: 1.0223 - regression_loss: 0.9050 - classification_loss: 0.1172 321/500 [==================>...........] - ETA: 1:00 - loss: 1.0212 - regression_loss: 0.9041 - classification_loss: 0.1171 322/500 [==================>...........] - ETA: 1:00 - loss: 1.0209 - regression_loss: 0.9036 - classification_loss: 0.1173 323/500 [==================>...........] - ETA: 1:00 - loss: 1.0190 - regression_loss: 0.9020 - classification_loss: 0.1171 324/500 [==================>...........] - ETA: 59s - loss: 1.0186 - regression_loss: 0.9017 - classification_loss: 0.1169  325/500 [==================>...........] - ETA: 59s - loss: 1.0164 - regression_loss: 0.8998 - classification_loss: 0.1166 326/500 [==================>...........] - ETA: 59s - loss: 1.0159 - regression_loss: 0.8994 - classification_loss: 0.1165 327/500 [==================>...........] - ETA: 58s - loss: 1.0141 - regression_loss: 0.8978 - classification_loss: 0.1163 328/500 [==================>...........] - ETA: 58s - loss: 1.0126 - regression_loss: 0.8965 - classification_loss: 0.1160 329/500 [==================>...........] - ETA: 58s - loss: 1.0114 - regression_loss: 0.8956 - classification_loss: 0.1159 330/500 [==================>...........] - ETA: 57s - loss: 1.0137 - regression_loss: 0.8974 - classification_loss: 0.1163 331/500 [==================>...........] - ETA: 57s - loss: 1.0128 - regression_loss: 0.8967 - classification_loss: 0.1162 332/500 [==================>...........] - ETA: 57s - loss: 1.0125 - regression_loss: 0.8965 - classification_loss: 0.1160 333/500 [==================>...........] - ETA: 56s - loss: 1.0124 - regression_loss: 0.8964 - classification_loss: 0.1160 334/500 [===================>..........] 
500/500 [==============================] - 170s 339ms/step - loss: 1.0128 - regression_loss: 0.8974 - classification_loss: 0.1153
326 instances of class plum with average precision: 0.8310
mAP: 0.8310
Epoch 00023: saving model to ./training/snapshots/resnet101_pascal_23.h5
Epoch 24/150
- ETA: 1:52 - loss: 1.0422 - regression_loss: 0.9156 - classification_loss: 0.1266 170/500 [=========>....................] - ETA: 1:52 - loss: 1.0389 - regression_loss: 0.9126 - classification_loss: 0.1264 171/500 [=========>....................] - ETA: 1:51 - loss: 1.0393 - regression_loss: 0.9129 - classification_loss: 0.1264 172/500 [=========>....................] - ETA: 1:51 - loss: 1.0408 - regression_loss: 0.9142 - classification_loss: 0.1266 173/500 [=========>....................] - ETA: 1:51 - loss: 1.0414 - regression_loss: 0.9147 - classification_loss: 0.1267 174/500 [=========>....................] - ETA: 1:50 - loss: 1.0414 - regression_loss: 0.9149 - classification_loss: 0.1264 175/500 [=========>....................] - ETA: 1:50 - loss: 1.0409 - regression_loss: 0.9147 - classification_loss: 0.1262 176/500 [=========>....................] - ETA: 1:50 - loss: 1.0395 - regression_loss: 0.9135 - classification_loss: 0.1260 177/500 [=========>....................] - ETA: 1:49 - loss: 1.0413 - regression_loss: 0.9152 - classification_loss: 0.1261 178/500 [=========>....................] - ETA: 1:49 - loss: 1.0433 - regression_loss: 0.9175 - classification_loss: 0.1259 179/500 [=========>....................] - ETA: 1:49 - loss: 1.0427 - regression_loss: 0.9170 - classification_loss: 0.1257 180/500 [=========>....................] - ETA: 1:48 - loss: 1.0412 - regression_loss: 0.9158 - classification_loss: 0.1254 181/500 [=========>....................] - ETA: 1:48 - loss: 1.0424 - regression_loss: 0.9169 - classification_loss: 0.1255 182/500 [=========>....................] - ETA: 1:48 - loss: 1.0430 - regression_loss: 0.9176 - classification_loss: 0.1255 183/500 [=========>....................] - ETA: 1:47 - loss: 1.0395 - regression_loss: 0.9146 - classification_loss: 0.1249 184/500 [==========>...................] - ETA: 1:47 - loss: 1.0443 - regression_loss: 0.9187 - classification_loss: 0.1256 185/500 [==========>...................] 
- ETA: 1:47 - loss: 1.0497 - regression_loss: 0.9241 - classification_loss: 0.1256 186/500 [==========>...................] - ETA: 1:46 - loss: 1.0492 - regression_loss: 0.9238 - classification_loss: 0.1254 187/500 [==========>...................] - ETA: 1:46 - loss: 1.0513 - regression_loss: 0.9255 - classification_loss: 0.1258 188/500 [==========>...................] - ETA: 1:46 - loss: 1.0490 - regression_loss: 0.9236 - classification_loss: 0.1254 189/500 [==========>...................] - ETA: 1:45 - loss: 1.0504 - regression_loss: 0.9243 - classification_loss: 0.1261 190/500 [==========>...................] - ETA: 1:45 - loss: 1.0491 - regression_loss: 0.9233 - classification_loss: 0.1258 191/500 [==========>...................] - ETA: 1:45 - loss: 1.0509 - regression_loss: 0.9247 - classification_loss: 0.1262 192/500 [==========>...................] - ETA: 1:44 - loss: 1.0515 - regression_loss: 0.9250 - classification_loss: 0.1264 193/500 [==========>...................] - ETA: 1:44 - loss: 1.0518 - regression_loss: 0.9257 - classification_loss: 0.1261 194/500 [==========>...................] - ETA: 1:44 - loss: 1.0485 - regression_loss: 0.9229 - classification_loss: 0.1256 195/500 [==========>...................] - ETA: 1:43 - loss: 1.0489 - regression_loss: 0.9234 - classification_loss: 0.1255 196/500 [==========>...................] - ETA: 1:43 - loss: 1.0495 - regression_loss: 0.9241 - classification_loss: 0.1255 197/500 [==========>...................] - ETA: 1:43 - loss: 1.0514 - regression_loss: 0.9260 - classification_loss: 0.1255 198/500 [==========>...................] - ETA: 1:42 - loss: 1.0508 - regression_loss: 0.9255 - classification_loss: 0.1253 199/500 [==========>...................] - ETA: 1:42 - loss: 1.0474 - regression_loss: 0.9224 - classification_loss: 0.1250 200/500 [===========>..................] - ETA: 1:42 - loss: 1.0454 - regression_loss: 0.9209 - classification_loss: 0.1245 201/500 [===========>..................] 
- ETA: 1:41 - loss: 1.0453 - regression_loss: 0.9205 - classification_loss: 0.1248 202/500 [===========>..................] - ETA: 1:41 - loss: 1.0454 - regression_loss: 0.9207 - classification_loss: 0.1247 203/500 [===========>..................] - ETA: 1:41 - loss: 1.0455 - regression_loss: 0.9210 - classification_loss: 0.1245 204/500 [===========>..................] - ETA: 1:40 - loss: 1.0457 - regression_loss: 0.9213 - classification_loss: 0.1244 205/500 [===========>..................] - ETA: 1:40 - loss: 1.0439 - regression_loss: 0.9197 - classification_loss: 0.1241 206/500 [===========>..................] - ETA: 1:40 - loss: 1.0439 - regression_loss: 0.9199 - classification_loss: 0.1240 207/500 [===========>..................] - ETA: 1:39 - loss: 1.0439 - regression_loss: 0.9201 - classification_loss: 0.1238 208/500 [===========>..................] - ETA: 1:39 - loss: 1.0432 - regression_loss: 0.9196 - classification_loss: 0.1236 209/500 [===========>..................] - ETA: 1:38 - loss: 1.0409 - regression_loss: 0.9177 - classification_loss: 0.1232 210/500 [===========>..................] - ETA: 1:38 - loss: 1.0374 - regression_loss: 0.9146 - classification_loss: 0.1228 211/500 [===========>..................] - ETA: 1:38 - loss: 1.0365 - regression_loss: 0.9141 - classification_loss: 0.1224 212/500 [===========>..................] - ETA: 1:37 - loss: 1.0349 - regression_loss: 0.9128 - classification_loss: 0.1222 213/500 [===========>..................] - ETA: 1:37 - loss: 1.0320 - regression_loss: 0.9103 - classification_loss: 0.1217 214/500 [===========>..................] - ETA: 1:37 - loss: 1.0299 - regression_loss: 0.9086 - classification_loss: 0.1213 215/500 [===========>..................] - ETA: 1:36 - loss: 1.0305 - regression_loss: 0.9093 - classification_loss: 0.1213 216/500 [===========>..................] - ETA: 1:36 - loss: 1.0324 - regression_loss: 0.9106 - classification_loss: 0.1218 217/500 [============>.................] 
- ETA: 1:36 - loss: 1.0307 - regression_loss: 0.9091 - classification_loss: 0.1216 218/500 [============>.................] - ETA: 1:35 - loss: 1.0297 - regression_loss: 0.9084 - classification_loss: 0.1213 219/500 [============>.................] - ETA: 1:35 - loss: 1.0261 - regression_loss: 0.9052 - classification_loss: 0.1208 220/500 [============>.................] - ETA: 1:35 - loss: 1.0277 - regression_loss: 0.9064 - classification_loss: 0.1213 221/500 [============>.................] - ETA: 1:34 - loss: 1.0265 - regression_loss: 0.9054 - classification_loss: 0.1211 222/500 [============>.................] - ETA: 1:34 - loss: 1.0258 - regression_loss: 0.9049 - classification_loss: 0.1209 223/500 [============>.................] - ETA: 1:34 - loss: 1.0253 - regression_loss: 0.9047 - classification_loss: 0.1206 224/500 [============>.................] - ETA: 1:33 - loss: 1.0267 - regression_loss: 0.9059 - classification_loss: 0.1208 225/500 [============>.................] - ETA: 1:33 - loss: 1.0235 - regression_loss: 0.9030 - classification_loss: 0.1205 226/500 [============>.................] - ETA: 1:33 - loss: 1.0263 - regression_loss: 0.9043 - classification_loss: 0.1221 227/500 [============>.................] - ETA: 1:32 - loss: 1.0237 - regression_loss: 0.9021 - classification_loss: 0.1216 228/500 [============>.................] - ETA: 1:32 - loss: 1.0250 - regression_loss: 0.9030 - classification_loss: 0.1221 229/500 [============>.................] - ETA: 1:32 - loss: 1.0237 - regression_loss: 0.9017 - classification_loss: 0.1220 230/500 [============>.................] - ETA: 1:31 - loss: 1.0257 - regression_loss: 0.9035 - classification_loss: 0.1222 231/500 [============>.................] - ETA: 1:31 - loss: 1.0250 - regression_loss: 0.9031 - classification_loss: 0.1219 232/500 [============>.................] - ETA: 1:31 - loss: 1.0239 - regression_loss: 0.9023 - classification_loss: 0.1216 233/500 [============>.................] 
- ETA: 1:30 - loss: 1.0273 - regression_loss: 0.9046 - classification_loss: 0.1227 234/500 [=============>................] - ETA: 1:30 - loss: 1.0258 - regression_loss: 0.9035 - classification_loss: 0.1223 235/500 [=============>................] - ETA: 1:30 - loss: 1.0235 - regression_loss: 0.9015 - classification_loss: 0.1221 236/500 [=============>................] - ETA: 1:29 - loss: 1.0262 - regression_loss: 0.9040 - classification_loss: 0.1222 237/500 [=============>................] - ETA: 1:29 - loss: 1.0257 - regression_loss: 0.9037 - classification_loss: 0.1220 238/500 [=============>................] - ETA: 1:29 - loss: 1.0264 - regression_loss: 0.9045 - classification_loss: 0.1219 239/500 [=============>................] - ETA: 1:28 - loss: 1.0261 - regression_loss: 0.9044 - classification_loss: 0.1217 240/500 [=============>................] - ETA: 1:28 - loss: 1.0257 - regression_loss: 0.9040 - classification_loss: 0.1217 241/500 [=============>................] - ETA: 1:28 - loss: 1.0270 - regression_loss: 0.9052 - classification_loss: 0.1218 242/500 [=============>................] - ETA: 1:27 - loss: 1.0239 - regression_loss: 0.9025 - classification_loss: 0.1214 243/500 [=============>................] - ETA: 1:27 - loss: 1.0213 - regression_loss: 0.9003 - classification_loss: 0.1209 244/500 [=============>................] - ETA: 1:27 - loss: 1.0218 - regression_loss: 0.9009 - classification_loss: 0.1209 245/500 [=============>................] - ETA: 1:26 - loss: 1.0220 - regression_loss: 0.9012 - classification_loss: 0.1207 246/500 [=============>................] - ETA: 1:26 - loss: 1.0212 - regression_loss: 0.9005 - classification_loss: 0.1206 247/500 [=============>................] - ETA: 1:26 - loss: 1.0289 - regression_loss: 0.9067 - classification_loss: 0.1222 248/500 [=============>................] - ETA: 1:25 - loss: 1.0323 - regression_loss: 0.9097 - classification_loss: 0.1225 249/500 [=============>................] 
- ETA: 1:25 - loss: 1.0307 - regression_loss: 0.9086 - classification_loss: 0.1222 250/500 [==============>...............] - ETA: 1:24 - loss: 1.0289 - regression_loss: 0.9071 - classification_loss: 0.1218 251/500 [==============>...............] - ETA: 1:24 - loss: 1.0271 - regression_loss: 0.9056 - classification_loss: 0.1215 252/500 [==============>...............] - ETA: 1:24 - loss: 1.0275 - regression_loss: 0.9060 - classification_loss: 0.1215 253/500 [==============>...............] - ETA: 1:23 - loss: 1.0276 - regression_loss: 0.9064 - classification_loss: 0.1212 254/500 [==============>...............] - ETA: 1:23 - loss: 1.0268 - regression_loss: 0.9058 - classification_loss: 0.1210 255/500 [==============>...............] - ETA: 1:23 - loss: 1.0252 - regression_loss: 0.9045 - classification_loss: 0.1207 256/500 [==============>...............] - ETA: 1:22 - loss: 1.0254 - regression_loss: 0.9044 - classification_loss: 0.1210 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0267 - regression_loss: 0.9053 - classification_loss: 0.1215 258/500 [==============>...............] - ETA: 1:22 - loss: 1.0285 - regression_loss: 0.9067 - classification_loss: 0.1218 259/500 [==============>...............] - ETA: 1:21 - loss: 1.0281 - regression_loss: 0.9065 - classification_loss: 0.1217 260/500 [==============>...............] - ETA: 1:21 - loss: 1.0285 - regression_loss: 0.9066 - classification_loss: 0.1219 261/500 [==============>...............] - ETA: 1:21 - loss: 1.0274 - regression_loss: 0.9058 - classification_loss: 0.1216 262/500 [==============>...............] - ETA: 1:20 - loss: 1.0262 - regression_loss: 0.9048 - classification_loss: 0.1214 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0263 - regression_loss: 0.9047 - classification_loss: 0.1216 264/500 [==============>...............] - ETA: 1:20 - loss: 1.0254 - regression_loss: 0.9039 - classification_loss: 0.1215 265/500 [==============>...............] 
- ETA: 1:19 - loss: 1.0251 - regression_loss: 0.9037 - classification_loss: 0.1215 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0254 - regression_loss: 0.9041 - classification_loss: 0.1213 267/500 [===============>..............] - ETA: 1:19 - loss: 1.0253 - regression_loss: 0.9040 - classification_loss: 0.1213 268/500 [===============>..............] - ETA: 1:18 - loss: 1.0263 - regression_loss: 0.9049 - classification_loss: 0.1213 269/500 [===============>..............] - ETA: 1:18 - loss: 1.0272 - regression_loss: 0.9059 - classification_loss: 0.1213 270/500 [===============>..............] - ETA: 1:18 - loss: 1.0281 - regression_loss: 0.9065 - classification_loss: 0.1216 271/500 [===============>..............] - ETA: 1:17 - loss: 1.0283 - regression_loss: 0.9068 - classification_loss: 0.1216 272/500 [===============>..............] - ETA: 1:17 - loss: 1.0291 - regression_loss: 0.9070 - classification_loss: 0.1221 273/500 [===============>..............] - ETA: 1:17 - loss: 1.0315 - regression_loss: 0.9092 - classification_loss: 0.1223 274/500 [===============>..............] - ETA: 1:16 - loss: 1.0312 - regression_loss: 0.9091 - classification_loss: 0.1221 275/500 [===============>..............] - ETA: 1:16 - loss: 1.0317 - regression_loss: 0.9094 - classification_loss: 0.1223 276/500 [===============>..............] - ETA: 1:16 - loss: 1.0333 - regression_loss: 0.9111 - classification_loss: 0.1222 277/500 [===============>..............] - ETA: 1:15 - loss: 1.0363 - regression_loss: 0.9137 - classification_loss: 0.1226 278/500 [===============>..............] - ETA: 1:15 - loss: 1.0360 - regression_loss: 0.9135 - classification_loss: 0.1226 279/500 [===============>..............] - ETA: 1:15 - loss: 1.0349 - regression_loss: 0.9126 - classification_loss: 0.1223 280/500 [===============>..............] - ETA: 1:14 - loss: 1.0350 - regression_loss: 0.9125 - classification_loss: 0.1225 281/500 [===============>..............] 
- ETA: 1:14 - loss: 1.0336 - regression_loss: 0.9113 - classification_loss: 0.1222 282/500 [===============>..............] - ETA: 1:14 - loss: 1.0324 - regression_loss: 0.9098 - classification_loss: 0.1227 283/500 [===============>..............] - ETA: 1:13 - loss: 1.0336 - regression_loss: 0.9108 - classification_loss: 0.1228 284/500 [================>.............] - ETA: 1:13 - loss: 1.0348 - regression_loss: 0.9119 - classification_loss: 0.1229 285/500 [================>.............] - ETA: 1:13 - loss: 1.0352 - regression_loss: 0.9123 - classification_loss: 0.1229 286/500 [================>.............] - ETA: 1:12 - loss: 1.0346 - regression_loss: 0.9119 - classification_loss: 0.1228 287/500 [================>.............] - ETA: 1:12 - loss: 1.0347 - regression_loss: 0.9118 - classification_loss: 0.1228 288/500 [================>.............] - ETA: 1:12 - loss: 1.0380 - regression_loss: 0.9138 - classification_loss: 0.1243 289/500 [================>.............] - ETA: 1:11 - loss: 1.0364 - regression_loss: 0.9123 - classification_loss: 0.1241 290/500 [================>.............] - ETA: 1:11 - loss: 1.0357 - regression_loss: 0.9117 - classification_loss: 0.1240 291/500 [================>.............] - ETA: 1:11 - loss: 1.0368 - regression_loss: 0.9127 - classification_loss: 0.1241 292/500 [================>.............] - ETA: 1:10 - loss: 1.0364 - regression_loss: 0.9125 - classification_loss: 0.1239 293/500 [================>.............] - ETA: 1:10 - loss: 1.0400 - regression_loss: 0.9153 - classification_loss: 0.1247 294/500 [================>.............] - ETA: 1:10 - loss: 1.0402 - regression_loss: 0.9157 - classification_loss: 0.1245 295/500 [================>.............] - ETA: 1:09 - loss: 1.0416 - regression_loss: 0.9167 - classification_loss: 0.1248 296/500 [================>.............] - ETA: 1:09 - loss: 1.0432 - regression_loss: 0.9177 - classification_loss: 0.1254 297/500 [================>.............] 
- ETA: 1:08 - loss: 1.0426 - regression_loss: 0.9173 - classification_loss: 0.1253 298/500 [================>.............] - ETA: 1:08 - loss: 1.0442 - regression_loss: 0.9184 - classification_loss: 0.1258 299/500 [================>.............] - ETA: 1:08 - loss: 1.0433 - regression_loss: 0.9176 - classification_loss: 0.1257 300/500 [=================>............] - ETA: 1:07 - loss: 1.0414 - regression_loss: 0.9161 - classification_loss: 0.1254 301/500 [=================>............] - ETA: 1:07 - loss: 1.0407 - regression_loss: 0.9155 - classification_loss: 0.1252 302/500 [=================>............] - ETA: 1:07 - loss: 1.0395 - regression_loss: 0.9145 - classification_loss: 0.1250 303/500 [=================>............] - ETA: 1:06 - loss: 1.0401 - regression_loss: 0.9151 - classification_loss: 0.1250 304/500 [=================>............] - ETA: 1:06 - loss: 1.0408 - regression_loss: 0.9158 - classification_loss: 0.1251 305/500 [=================>............] - ETA: 1:06 - loss: 1.0417 - regression_loss: 0.9167 - classification_loss: 0.1251 306/500 [=================>............] - ETA: 1:05 - loss: 1.0417 - regression_loss: 0.9166 - classification_loss: 0.1251 307/500 [=================>............] - ETA: 1:05 - loss: 1.0397 - regression_loss: 0.9149 - classification_loss: 0.1248 308/500 [=================>............] - ETA: 1:05 - loss: 1.0391 - regression_loss: 0.9145 - classification_loss: 0.1247 309/500 [=================>............] - ETA: 1:04 - loss: 1.0411 - regression_loss: 0.9162 - classification_loss: 0.1249 310/500 [=================>............] - ETA: 1:04 - loss: 1.0395 - regression_loss: 0.9148 - classification_loss: 0.1247 311/500 [=================>............] - ETA: 1:04 - loss: 1.0395 - regression_loss: 0.9144 - classification_loss: 0.1250 312/500 [=================>............] - ETA: 1:03 - loss: 1.0400 - regression_loss: 0.9149 - classification_loss: 0.1251 313/500 [=================>............] 
- ETA: 1:03 - loss: 1.0400 - regression_loss: 0.9150 - classification_loss: 0.1250 314/500 [=================>............] - ETA: 1:03 - loss: 1.0401 - regression_loss: 0.9151 - classification_loss: 0.1250 315/500 [=================>............] - ETA: 1:02 - loss: 1.0395 - regression_loss: 0.9146 - classification_loss: 0.1249 316/500 [=================>............] - ETA: 1:02 - loss: 1.0389 - regression_loss: 0.9141 - classification_loss: 0.1248 317/500 [==================>...........] - ETA: 1:02 - loss: 1.0394 - regression_loss: 0.9146 - classification_loss: 0.1248 318/500 [==================>...........] - ETA: 1:01 - loss: 1.0387 - regression_loss: 0.9141 - classification_loss: 0.1247 319/500 [==================>...........] - ETA: 1:01 - loss: 1.0368 - regression_loss: 0.9122 - classification_loss: 0.1246 320/500 [==================>...........] - ETA: 1:01 - loss: 1.0351 - regression_loss: 0.9108 - classification_loss: 0.1244 321/500 [==================>...........] - ETA: 1:00 - loss: 1.0350 - regression_loss: 0.9108 - classification_loss: 0.1243 322/500 [==================>...........] - ETA: 1:00 - loss: 1.0332 - regression_loss: 0.9093 - classification_loss: 0.1239 323/500 [==================>...........] - ETA: 1:00 - loss: 1.0328 - regression_loss: 0.9089 - classification_loss: 0.1239 324/500 [==================>...........] - ETA: 59s - loss: 1.0341 - regression_loss: 0.9099 - classification_loss: 0.1242  325/500 [==================>...........] - ETA: 59s - loss: 1.0347 - regression_loss: 0.9102 - classification_loss: 0.1244 326/500 [==================>...........] - ETA: 59s - loss: 1.0331 - regression_loss: 0.9089 - classification_loss: 0.1241 327/500 [==================>...........] - ETA: 58s - loss: 1.0343 - regression_loss: 0.9100 - classification_loss: 0.1243 328/500 [==================>...........] - ETA: 58s - loss: 1.0346 - regression_loss: 0.9104 - classification_loss: 0.1243 329/500 [==================>...........] 
- ETA: 58s - loss: 1.0357 - regression_loss: 0.9113 - classification_loss: 0.1244 330/500 [==================>...........] - ETA: 57s - loss: 1.0357 - regression_loss: 0.9113 - classification_loss: 0.1244 331/500 [==================>...........] - ETA: 57s - loss: 1.0393 - regression_loss: 0.9143 - classification_loss: 0.1250 332/500 [==================>...........] - ETA: 57s - loss: 1.0400 - regression_loss: 0.9148 - classification_loss: 0.1252 333/500 [==================>...........] - ETA: 56s - loss: 1.0387 - regression_loss: 0.9137 - classification_loss: 0.1250 334/500 [===================>..........] - ETA: 56s - loss: 1.0368 - regression_loss: 0.9120 - classification_loss: 0.1248 335/500 [===================>..........] - ETA: 56s - loss: 1.0367 - regression_loss: 0.9122 - classification_loss: 0.1245 336/500 [===================>..........] - ETA: 55s - loss: 1.0369 - regression_loss: 0.9125 - classification_loss: 0.1245 337/500 [===================>..........] - ETA: 55s - loss: 1.0358 - regression_loss: 0.9116 - classification_loss: 0.1242 338/500 [===================>..........] - ETA: 55s - loss: 1.0344 - regression_loss: 0.9105 - classification_loss: 0.1239 339/500 [===================>..........] - ETA: 54s - loss: 1.0345 - regression_loss: 0.9108 - classification_loss: 0.1238 340/500 [===================>..........] - ETA: 54s - loss: 1.0340 - regression_loss: 0.9101 - classification_loss: 0.1240 341/500 [===================>..........] - ETA: 53s - loss: 1.0325 - regression_loss: 0.9088 - classification_loss: 0.1237 342/500 [===================>..........] - ETA: 53s - loss: 1.0330 - regression_loss: 0.9092 - classification_loss: 0.1238 343/500 [===================>..........] - ETA: 53s - loss: 1.0322 - regression_loss: 0.9085 - classification_loss: 0.1237 344/500 [===================>..........] - ETA: 52s - loss: 1.0326 - regression_loss: 0.9090 - classification_loss: 0.1236 345/500 [===================>..........] 
- ETA: 52s - loss: 1.0306 - regression_loss: 0.9072 - classification_loss: 0.1234 346/500 [===================>..........] - ETA: 52s - loss: 1.0309 - regression_loss: 0.9072 - classification_loss: 0.1237 347/500 [===================>..........] - ETA: 51s - loss: 1.0321 - regression_loss: 0.9082 - classification_loss: 0.1240 348/500 [===================>..........] - ETA: 51s - loss: 1.0320 - regression_loss: 0.9083 - classification_loss: 0.1237 349/500 [===================>..........] - ETA: 51s - loss: 1.0306 - regression_loss: 0.9071 - classification_loss: 0.1235 350/500 [====================>.........] - ETA: 50s - loss: 1.0300 - regression_loss: 0.9066 - classification_loss: 0.1234 351/500 [====================>.........] - ETA: 50s - loss: 1.0307 - regression_loss: 0.9073 - classification_loss: 0.1234 352/500 [====================>.........] - ETA: 50s - loss: 1.0292 - regression_loss: 0.9060 - classification_loss: 0.1232 353/500 [====================>.........] - ETA: 49s - loss: 1.0331 - regression_loss: 0.9096 - classification_loss: 0.1234 354/500 [====================>.........] - ETA: 49s - loss: 1.0322 - regression_loss: 0.9088 - classification_loss: 0.1233 355/500 [====================>.........] - ETA: 49s - loss: 1.0304 - regression_loss: 0.9073 - classification_loss: 0.1230 356/500 [====================>.........] - ETA: 48s - loss: 1.0299 - regression_loss: 0.9069 - classification_loss: 0.1229 357/500 [====================>.........] - ETA: 48s - loss: 1.0286 - regression_loss: 0.9059 - classification_loss: 0.1227 358/500 [====================>.........] - ETA: 48s - loss: 1.0281 - regression_loss: 0.9055 - classification_loss: 0.1226 359/500 [====================>.........] - ETA: 47s - loss: 1.0285 - regression_loss: 0.9057 - classification_loss: 0.1228 360/500 [====================>.........] - ETA: 47s - loss: 1.0284 - regression_loss: 0.9057 - classification_loss: 0.1228 361/500 [====================>.........] 
- ETA: 47s - loss: 1.0284 - regression_loss: 0.9057 - classification_loss: 0.1227 362/500 [====================>.........] - ETA: 46s - loss: 1.0294 - regression_loss: 0.9066 - classification_loss: 0.1229 363/500 [====================>.........] - ETA: 46s - loss: 1.0287 - regression_loss: 0.9061 - classification_loss: 0.1226 364/500 [====================>.........] - ETA: 46s - loss: 1.0265 - regression_loss: 0.9041 - classification_loss: 0.1224 365/500 [====================>.........] - ETA: 45s - loss: 1.0257 - regression_loss: 0.9035 - classification_loss: 0.1221 366/500 [====================>.........] - ETA: 45s - loss: 1.0242 - regression_loss: 0.9023 - classification_loss: 0.1219 367/500 [=====================>........] - ETA: 45s - loss: 1.0227 - regression_loss: 0.9010 - classification_loss: 0.1217 368/500 [=====================>........] - ETA: 44s - loss: 1.0245 - regression_loss: 0.9024 - classification_loss: 0.1221 369/500 [=====================>........] - ETA: 44s - loss: 1.0238 - regression_loss: 0.9019 - classification_loss: 0.1219 370/500 [=====================>........] - ETA: 44s - loss: 1.0223 - regression_loss: 0.9006 - classification_loss: 0.1217 371/500 [=====================>........] - ETA: 43s - loss: 1.0226 - regression_loss: 0.9008 - classification_loss: 0.1218 372/500 [=====================>........] - ETA: 43s - loss: 1.0216 - regression_loss: 0.9000 - classification_loss: 0.1216 373/500 [=====================>........] - ETA: 43s - loss: 1.0224 - regression_loss: 0.9008 - classification_loss: 0.1216 374/500 [=====================>........] - ETA: 42s - loss: 1.0231 - regression_loss: 0.9015 - classification_loss: 0.1216 375/500 [=====================>........] - ETA: 42s - loss: 1.0230 - regression_loss: 0.9015 - classification_loss: 0.1215 376/500 [=====================>........] - ETA: 42s - loss: 1.0222 - regression_loss: 0.9009 - classification_loss: 0.1213 377/500 [=====================>........] 
- ETA: 41s - loss: 1.0228 - regression_loss: 0.9015 - classification_loss: 0.1212
[per-batch progress lines 378/500 through 499/500 omitted]
500/500 [==============================] - 170s 340ms/step - loss: 1.0190 - regression_loss: 0.9014 - classification_loss: 0.1176
326 instances of class plum with average precision: 0.8467
mAP: 0.8467
Epoch 00024: saving model to ./training/snapshots/resnet101_pascal_24.h5
Epoch 25/150
[per-batch progress lines 1/500 through 211/500 omitted]
212/500 [===========>..................]
- ETA: 1:37 - loss: 0.9806 - regression_loss: 0.8719 - classification_loss: 0.1087 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9843 - regression_loss: 0.8748 - classification_loss: 0.1095 214/500 [===========>..................] - ETA: 1:36 - loss: 0.9840 - regression_loss: 0.8747 - classification_loss: 0.1093 215/500 [===========>..................] - ETA: 1:36 - loss: 0.9827 - regression_loss: 0.8736 - classification_loss: 0.1091 216/500 [===========>..................] - ETA: 1:36 - loss: 0.9834 - regression_loss: 0.8742 - classification_loss: 0.1092 217/500 [============>.................] - ETA: 1:35 - loss: 0.9803 - regression_loss: 0.8715 - classification_loss: 0.1088 218/500 [============>.................] - ETA: 1:35 - loss: 0.9794 - regression_loss: 0.8709 - classification_loss: 0.1086 219/500 [============>.................] - ETA: 1:35 - loss: 0.9788 - regression_loss: 0.8705 - classification_loss: 0.1084 220/500 [============>.................] - ETA: 1:34 - loss: 0.9774 - regression_loss: 0.8693 - classification_loss: 0.1081 221/500 [============>.................] - ETA: 1:34 - loss: 0.9790 - regression_loss: 0.8710 - classification_loss: 0.1080 222/500 [============>.................] - ETA: 1:34 - loss: 0.9794 - regression_loss: 0.8715 - classification_loss: 0.1079 223/500 [============>.................] - ETA: 1:33 - loss: 0.9786 - regression_loss: 0.8708 - classification_loss: 0.1078 224/500 [============>.................] - ETA: 1:33 - loss: 0.9778 - regression_loss: 0.8702 - classification_loss: 0.1077 225/500 [============>.................] - ETA: 1:33 - loss: 0.9788 - regression_loss: 0.8709 - classification_loss: 0.1078 226/500 [============>.................] - ETA: 1:32 - loss: 0.9776 - regression_loss: 0.8700 - classification_loss: 0.1075 227/500 [============>.................] - ETA: 1:32 - loss: 0.9764 - regression_loss: 0.8691 - classification_loss: 0.1073 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.9756 - regression_loss: 0.8686 - classification_loss: 0.1070 229/500 [============>.................] - ETA: 1:31 - loss: 0.9767 - regression_loss: 0.8697 - classification_loss: 0.1071 230/500 [============>.................] - ETA: 1:31 - loss: 0.9751 - regression_loss: 0.8683 - classification_loss: 0.1068 231/500 [============>.................] - ETA: 1:31 - loss: 0.9759 - regression_loss: 0.8692 - classification_loss: 0.1068 232/500 [============>.................] - ETA: 1:30 - loss: 0.9762 - regression_loss: 0.8695 - classification_loss: 0.1067 233/500 [============>.................] - ETA: 1:30 - loss: 0.9764 - regression_loss: 0.8697 - classification_loss: 0.1067 234/500 [=============>................] - ETA: 1:30 - loss: 0.9758 - regression_loss: 0.8691 - classification_loss: 0.1067 235/500 [=============>................] - ETA: 1:29 - loss: 0.9737 - regression_loss: 0.8673 - classification_loss: 0.1064 236/500 [=============>................] - ETA: 1:29 - loss: 0.9725 - regression_loss: 0.8664 - classification_loss: 0.1061 237/500 [=============>................] - ETA: 1:29 - loss: 0.9693 - regression_loss: 0.8635 - classification_loss: 0.1057 238/500 [=============>................] - ETA: 1:28 - loss: 0.9679 - regression_loss: 0.8624 - classification_loss: 0.1055 239/500 [=============>................] - ETA: 1:28 - loss: 0.9663 - regression_loss: 0.8611 - classification_loss: 0.1052 240/500 [=============>................] - ETA: 1:27 - loss: 0.9649 - regression_loss: 0.8600 - classification_loss: 0.1049 241/500 [=============>................] - ETA: 1:27 - loss: 0.9638 - regression_loss: 0.8591 - classification_loss: 0.1047 242/500 [=============>................] - ETA: 1:27 - loss: 0.9636 - regression_loss: 0.8590 - classification_loss: 0.1046 243/500 [=============>................] - ETA: 1:26 - loss: 0.9653 - regression_loss: 0.8605 - classification_loss: 0.1048 244/500 [=============>................] 
- ETA: 1:26 - loss: 0.9656 - regression_loss: 0.8611 - classification_loss: 0.1046 245/500 [=============>................] - ETA: 1:26 - loss: 0.9677 - regression_loss: 0.8627 - classification_loss: 0.1050 246/500 [=============>................] - ETA: 1:25 - loss: 0.9664 - regression_loss: 0.8614 - classification_loss: 0.1050 247/500 [=============>................] - ETA: 1:25 - loss: 0.9638 - regression_loss: 0.8589 - classification_loss: 0.1048 248/500 [=============>................] - ETA: 1:25 - loss: 0.9672 - regression_loss: 0.8620 - classification_loss: 0.1052 249/500 [=============>................] - ETA: 1:24 - loss: 0.9673 - regression_loss: 0.8621 - classification_loss: 0.1052 250/500 [==============>...............] - ETA: 1:24 - loss: 0.9654 - regression_loss: 0.8605 - classification_loss: 0.1049 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9660 - regression_loss: 0.8611 - classification_loss: 0.1049 252/500 [==============>...............] - ETA: 1:23 - loss: 0.9660 - regression_loss: 0.8612 - classification_loss: 0.1048 253/500 [==============>...............] - ETA: 1:23 - loss: 0.9667 - regression_loss: 0.8620 - classification_loss: 0.1047 254/500 [==============>...............] - ETA: 1:23 - loss: 0.9675 - regression_loss: 0.8627 - classification_loss: 0.1048 255/500 [==============>...............] - ETA: 1:22 - loss: 0.9686 - regression_loss: 0.8633 - classification_loss: 0.1053 256/500 [==============>...............] - ETA: 1:22 - loss: 0.9706 - regression_loss: 0.8650 - classification_loss: 0.1057 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9689 - regression_loss: 0.8634 - classification_loss: 0.1054 258/500 [==============>...............] - ETA: 1:21 - loss: 0.9680 - regression_loss: 0.8628 - classification_loss: 0.1052 259/500 [==============>...............] - ETA: 1:21 - loss: 0.9687 - regression_loss: 0.8635 - classification_loss: 0.1052 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.9695 - regression_loss: 0.8643 - classification_loss: 0.1052 261/500 [==============>...............] - ETA: 1:20 - loss: 0.9686 - regression_loss: 0.8636 - classification_loss: 0.1050 262/500 [==============>...............] - ETA: 1:20 - loss: 0.9694 - regression_loss: 0.8644 - classification_loss: 0.1051 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9681 - regression_loss: 0.8632 - classification_loss: 0.1048 264/500 [==============>...............] - ETA: 1:19 - loss: 0.9687 - regression_loss: 0.8637 - classification_loss: 0.1050 265/500 [==============>...............] - ETA: 1:19 - loss: 0.9707 - regression_loss: 0.8654 - classification_loss: 0.1053 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9711 - regression_loss: 0.8658 - classification_loss: 0.1053 267/500 [===============>..............] - ETA: 1:18 - loss: 0.9734 - regression_loss: 0.8680 - classification_loss: 0.1054 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9726 - regression_loss: 0.8673 - classification_loss: 0.1053 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9749 - regression_loss: 0.8688 - classification_loss: 0.1060 270/500 [===============>..............] - ETA: 1:17 - loss: 0.9735 - regression_loss: 0.8677 - classification_loss: 0.1059 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9738 - regression_loss: 0.8681 - classification_loss: 0.1057 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9724 - regression_loss: 0.8668 - classification_loss: 0.1056 273/500 [===============>..............] - ETA: 1:16 - loss: 0.9720 - regression_loss: 0.8664 - classification_loss: 0.1056 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9694 - regression_loss: 0.8640 - classification_loss: 0.1053 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9688 - regression_loss: 0.8637 - classification_loss: 0.1051 276/500 [===============>..............] 
- ETA: 1:15 - loss: 0.9689 - regression_loss: 0.8636 - classification_loss: 0.1052 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9674 - regression_loss: 0.8623 - classification_loss: 0.1051 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9706 - regression_loss: 0.8643 - classification_loss: 0.1063 279/500 [===============>..............] - ETA: 1:14 - loss: 0.9692 - regression_loss: 0.8631 - classification_loss: 0.1061 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9689 - regression_loss: 0.8629 - classification_loss: 0.1060 281/500 [===============>..............] - ETA: 1:14 - loss: 0.9678 - regression_loss: 0.8619 - classification_loss: 0.1058 282/500 [===============>..............] - ETA: 1:13 - loss: 0.9693 - regression_loss: 0.8630 - classification_loss: 0.1063 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9697 - regression_loss: 0.8633 - classification_loss: 0.1064 284/500 [================>.............] - ETA: 1:13 - loss: 0.9675 - regression_loss: 0.8614 - classification_loss: 0.1061 285/500 [================>.............] - ETA: 1:12 - loss: 0.9679 - regression_loss: 0.8619 - classification_loss: 0.1060 286/500 [================>.............] - ETA: 1:12 - loss: 0.9670 - regression_loss: 0.8611 - classification_loss: 0.1059 287/500 [================>.............] - ETA: 1:12 - loss: 0.9647 - regression_loss: 0.8591 - classification_loss: 0.1056 288/500 [================>.............] - ETA: 1:11 - loss: 0.9640 - regression_loss: 0.8585 - classification_loss: 0.1055 289/500 [================>.............] - ETA: 1:11 - loss: 0.9645 - regression_loss: 0.8590 - classification_loss: 0.1055 290/500 [================>.............] - ETA: 1:11 - loss: 0.9667 - regression_loss: 0.8606 - classification_loss: 0.1061 291/500 [================>.............] - ETA: 1:10 - loss: 0.9686 - regression_loss: 0.8622 - classification_loss: 0.1063 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.9696 - regression_loss: 0.8631 - classification_loss: 0.1065 293/500 [================>.............] - ETA: 1:10 - loss: 0.9690 - regression_loss: 0.8627 - classification_loss: 0.1063 294/500 [================>.............] - ETA: 1:09 - loss: 0.9690 - regression_loss: 0.8627 - classification_loss: 0.1063 295/500 [================>.............] - ETA: 1:09 - loss: 0.9670 - regression_loss: 0.8608 - classification_loss: 0.1062 296/500 [================>.............] - ETA: 1:09 - loss: 0.9657 - regression_loss: 0.8598 - classification_loss: 0.1060 297/500 [================>.............] - ETA: 1:08 - loss: 0.9667 - regression_loss: 0.8606 - classification_loss: 0.1061 298/500 [================>.............] - ETA: 1:08 - loss: 0.9678 - regression_loss: 0.8616 - classification_loss: 0.1062 299/500 [================>.............] - ETA: 1:07 - loss: 0.9681 - regression_loss: 0.8620 - classification_loss: 0.1061 300/500 [=================>............] - ETA: 1:07 - loss: 0.9670 - regression_loss: 0.8609 - classification_loss: 0.1061 301/500 [=================>............] - ETA: 1:07 - loss: 0.9666 - regression_loss: 0.8606 - classification_loss: 0.1060 302/500 [=================>............] - ETA: 1:06 - loss: 0.9663 - regression_loss: 0.8604 - classification_loss: 0.1059 303/500 [=================>............] - ETA: 1:06 - loss: 0.9676 - regression_loss: 0.8610 - classification_loss: 0.1065 304/500 [=================>............] - ETA: 1:06 - loss: 0.9652 - regression_loss: 0.8589 - classification_loss: 0.1062 305/500 [=================>............] - ETA: 1:05 - loss: 0.9661 - regression_loss: 0.8597 - classification_loss: 0.1064 306/500 [=================>............] - ETA: 1:05 - loss: 0.9657 - regression_loss: 0.8594 - classification_loss: 0.1063 307/500 [=================>............] - ETA: 1:05 - loss: 0.9688 - regression_loss: 0.8626 - classification_loss: 0.1061 308/500 [=================>............] 
- ETA: 1:04 - loss: 0.9700 - regression_loss: 0.8637 - classification_loss: 0.1063 309/500 [=================>............] - ETA: 1:04 - loss: 0.9713 - regression_loss: 0.8650 - classification_loss: 0.1063 310/500 [=================>............] - ETA: 1:04 - loss: 0.9736 - regression_loss: 0.8668 - classification_loss: 0.1068 311/500 [=================>............] - ETA: 1:03 - loss: 0.9728 - regression_loss: 0.8662 - classification_loss: 0.1066 312/500 [=================>............] - ETA: 1:03 - loss: 0.9750 - regression_loss: 0.8679 - classification_loss: 0.1071 313/500 [=================>............] - ETA: 1:03 - loss: 0.9747 - regression_loss: 0.8677 - classification_loss: 0.1070 314/500 [=================>............] - ETA: 1:02 - loss: 0.9738 - regression_loss: 0.8668 - classification_loss: 0.1070 315/500 [=================>............] - ETA: 1:02 - loss: 0.9751 - regression_loss: 0.8681 - classification_loss: 0.1070 316/500 [=================>............] - ETA: 1:02 - loss: 0.9738 - regression_loss: 0.8670 - classification_loss: 0.1068 317/500 [==================>...........] - ETA: 1:01 - loss: 0.9729 - regression_loss: 0.8662 - classification_loss: 0.1067 318/500 [==================>...........] - ETA: 1:01 - loss: 0.9724 - regression_loss: 0.8658 - classification_loss: 0.1066 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9726 - regression_loss: 0.8658 - classification_loss: 0.1067 320/500 [==================>...........] - ETA: 1:00 - loss: 0.9739 - regression_loss: 0.8671 - classification_loss: 0.1068 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9763 - regression_loss: 0.8693 - classification_loss: 0.1070 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9763 - regression_loss: 0.8694 - classification_loss: 0.1069 323/500 [==================>...........] - ETA: 59s - loss: 0.9772 - regression_loss: 0.8702 - classification_loss: 0.1070  324/500 [==================>...........] 
- ETA: 59s - loss: 0.9769 - regression_loss: 0.8701 - classification_loss: 0.1069 325/500 [==================>...........] - ETA: 59s - loss: 0.9766 - regression_loss: 0.8698 - classification_loss: 0.1069 326/500 [==================>...........] - ETA: 58s - loss: 0.9755 - regression_loss: 0.8688 - classification_loss: 0.1067 327/500 [==================>...........] - ETA: 58s - loss: 0.9746 - regression_loss: 0.8680 - classification_loss: 0.1066 328/500 [==================>...........] - ETA: 58s - loss: 0.9756 - regression_loss: 0.8689 - classification_loss: 0.1067 329/500 [==================>...........] - ETA: 57s - loss: 0.9785 - regression_loss: 0.8710 - classification_loss: 0.1075 330/500 [==================>...........] - ETA: 57s - loss: 0.9774 - regression_loss: 0.8702 - classification_loss: 0.1073 331/500 [==================>...........] - ETA: 57s - loss: 0.9758 - regression_loss: 0.8687 - classification_loss: 0.1071 332/500 [==================>...........] - ETA: 56s - loss: 0.9739 - regression_loss: 0.8670 - classification_loss: 0.1069 333/500 [==================>...........] - ETA: 56s - loss: 0.9757 - regression_loss: 0.8683 - classification_loss: 0.1074 334/500 [===================>..........] - ETA: 56s - loss: 0.9759 - regression_loss: 0.8685 - classification_loss: 0.1074 335/500 [===================>..........] - ETA: 55s - loss: 0.9761 - regression_loss: 0.8686 - classification_loss: 0.1075 336/500 [===================>..........] - ETA: 55s - loss: 0.9764 - regression_loss: 0.8690 - classification_loss: 0.1073 337/500 [===================>..........] - ETA: 55s - loss: 0.9748 - regression_loss: 0.8677 - classification_loss: 0.1071 338/500 [===================>..........] - ETA: 54s - loss: 0.9733 - regression_loss: 0.8663 - classification_loss: 0.1070 339/500 [===================>..........] - ETA: 54s - loss: 0.9709 - regression_loss: 0.8642 - classification_loss: 0.1067 340/500 [===================>..........] 
- ETA: 54s - loss: 0.9721 - regression_loss: 0.8654 - classification_loss: 0.1067 341/500 [===================>..........] - ETA: 53s - loss: 0.9707 - regression_loss: 0.8641 - classification_loss: 0.1065 342/500 [===================>..........] - ETA: 53s - loss: 0.9696 - regression_loss: 0.8632 - classification_loss: 0.1064 343/500 [===================>..........] - ETA: 53s - loss: 0.9678 - regression_loss: 0.8617 - classification_loss: 0.1062 344/500 [===================>..........] - ETA: 52s - loss: 0.9687 - regression_loss: 0.8623 - classification_loss: 0.1064 345/500 [===================>..........] - ETA: 52s - loss: 0.9681 - regression_loss: 0.8616 - classification_loss: 0.1065 346/500 [===================>..........] - ETA: 52s - loss: 0.9676 - regression_loss: 0.8613 - classification_loss: 0.1063 347/500 [===================>..........] - ETA: 51s - loss: 0.9693 - regression_loss: 0.8629 - classification_loss: 0.1064 348/500 [===================>..........] - ETA: 51s - loss: 0.9721 - regression_loss: 0.8655 - classification_loss: 0.1066 349/500 [===================>..........] - ETA: 51s - loss: 0.9759 - regression_loss: 0.8687 - classification_loss: 0.1072 350/500 [====================>.........] - ETA: 50s - loss: 0.9768 - regression_loss: 0.8697 - classification_loss: 0.1071 351/500 [====================>.........] - ETA: 50s - loss: 0.9775 - regression_loss: 0.8702 - classification_loss: 0.1073 352/500 [====================>.........] - ETA: 50s - loss: 0.9771 - regression_loss: 0.8699 - classification_loss: 0.1072 353/500 [====================>.........] - ETA: 49s - loss: 0.9768 - regression_loss: 0.8697 - classification_loss: 0.1071 354/500 [====================>.........] - ETA: 49s - loss: 0.9759 - regression_loss: 0.8689 - classification_loss: 0.1071 355/500 [====================>.........] - ETA: 49s - loss: 0.9755 - regression_loss: 0.8685 - classification_loss: 0.1070 356/500 [====================>.........] 
- ETA: 48s - loss: 0.9757 - regression_loss: 0.8687 - classification_loss: 0.1070 357/500 [====================>.........] - ETA: 48s - loss: 0.9761 - regression_loss: 0.8690 - classification_loss: 0.1071 358/500 [====================>.........] - ETA: 48s - loss: 0.9766 - regression_loss: 0.8694 - classification_loss: 0.1071 359/500 [====================>.........] - ETA: 47s - loss: 0.9767 - regression_loss: 0.8697 - classification_loss: 0.1070 360/500 [====================>.........] - ETA: 47s - loss: 0.9785 - regression_loss: 0.8705 - classification_loss: 0.1081 361/500 [====================>.........] - ETA: 46s - loss: 0.9766 - regression_loss: 0.8687 - classification_loss: 0.1079 362/500 [====================>.........] - ETA: 46s - loss: 0.9746 - regression_loss: 0.8670 - classification_loss: 0.1076 363/500 [====================>.........] - ETA: 46s - loss: 0.9740 - regression_loss: 0.8665 - classification_loss: 0.1075 364/500 [====================>.........] - ETA: 45s - loss: 0.9729 - regression_loss: 0.8656 - classification_loss: 0.1073 365/500 [====================>.........] - ETA: 45s - loss: 0.9712 - regression_loss: 0.8641 - classification_loss: 0.1071 366/500 [====================>.........] - ETA: 45s - loss: 0.9704 - regression_loss: 0.8633 - classification_loss: 0.1071 367/500 [=====================>........] - ETA: 44s - loss: 0.9695 - regression_loss: 0.8626 - classification_loss: 0.1069 368/500 [=====================>........] - ETA: 44s - loss: 0.9687 - regression_loss: 0.8619 - classification_loss: 0.1068 369/500 [=====================>........] - ETA: 44s - loss: 0.9683 - regression_loss: 0.8617 - classification_loss: 0.1066 370/500 [=====================>........] - ETA: 43s - loss: 0.9675 - regression_loss: 0.8611 - classification_loss: 0.1064 371/500 [=====================>........] - ETA: 43s - loss: 0.9680 - regression_loss: 0.8615 - classification_loss: 0.1064 372/500 [=====================>........] 
- ETA: 43s - loss: 0.9673 - regression_loss: 0.8610 - classification_loss: 0.1063 373/500 [=====================>........] - ETA: 42s - loss: 0.9697 - regression_loss: 0.8631 - classification_loss: 0.1066 374/500 [=====================>........] - ETA: 42s - loss: 0.9693 - regression_loss: 0.8627 - classification_loss: 0.1066 375/500 [=====================>........] - ETA: 42s - loss: 0.9713 - regression_loss: 0.8643 - classification_loss: 0.1069 376/500 [=====================>........] - ETA: 41s - loss: 0.9727 - regression_loss: 0.8653 - classification_loss: 0.1074 377/500 [=====================>........] - ETA: 41s - loss: 0.9729 - regression_loss: 0.8654 - classification_loss: 0.1074 378/500 [=====================>........] - ETA: 41s - loss: 0.9729 - regression_loss: 0.8654 - classification_loss: 0.1075 379/500 [=====================>........] - ETA: 40s - loss: 0.9719 - regression_loss: 0.8646 - classification_loss: 0.1073 380/500 [=====================>........] - ETA: 40s - loss: 0.9723 - regression_loss: 0.8650 - classification_loss: 0.1073 381/500 [=====================>........] - ETA: 40s - loss: 0.9711 - regression_loss: 0.8640 - classification_loss: 0.1071 382/500 [=====================>........] - ETA: 39s - loss: 0.9702 - regression_loss: 0.8632 - classification_loss: 0.1070 383/500 [=====================>........] - ETA: 39s - loss: 0.9707 - regression_loss: 0.8636 - classification_loss: 0.1071 384/500 [======================>.......] - ETA: 39s - loss: 0.9710 - regression_loss: 0.8640 - classification_loss: 0.1070 385/500 [======================>.......] - ETA: 38s - loss: 0.9708 - regression_loss: 0.8618 - classification_loss: 0.1090 386/500 [======================>.......] - ETA: 38s - loss: 0.9707 - regression_loss: 0.8617 - classification_loss: 0.1090 387/500 [======================>.......] - ETA: 38s - loss: 0.9697 - regression_loss: 0.8609 - classification_loss: 0.1088 388/500 [======================>.......] 
- ETA: 37s - loss: 0.9699 - regression_loss: 0.8611 - classification_loss: 0.1088 389/500 [======================>.......] - ETA: 37s - loss: 0.9703 - regression_loss: 0.8616 - classification_loss: 0.1088 390/500 [======================>.......] - ETA: 37s - loss: 0.9708 - regression_loss: 0.8620 - classification_loss: 0.1088 391/500 [======================>.......] - ETA: 36s - loss: 0.9718 - regression_loss: 0.8628 - classification_loss: 0.1090 392/500 [======================>.......] - ETA: 36s - loss: 0.9704 - regression_loss: 0.8615 - classification_loss: 0.1088 393/500 [======================>.......] - ETA: 36s - loss: 0.9698 - regression_loss: 0.8611 - classification_loss: 0.1087 394/500 [======================>.......] - ETA: 35s - loss: 0.9686 - regression_loss: 0.8600 - classification_loss: 0.1086 395/500 [======================>.......] - ETA: 35s - loss: 0.9680 - regression_loss: 0.8596 - classification_loss: 0.1084 396/500 [======================>.......] - ETA: 35s - loss: 0.9681 - regression_loss: 0.8597 - classification_loss: 0.1084 397/500 [======================>.......] - ETA: 34s - loss: 0.9676 - regression_loss: 0.8593 - classification_loss: 0.1083 398/500 [======================>.......] - ETA: 34s - loss: 0.9696 - regression_loss: 0.8608 - classification_loss: 0.1087 399/500 [======================>.......] - ETA: 34s - loss: 0.9695 - regression_loss: 0.8608 - classification_loss: 0.1087 400/500 [=======================>......] - ETA: 33s - loss: 0.9724 - regression_loss: 0.8632 - classification_loss: 0.1092 401/500 [=======================>......] - ETA: 33s - loss: 0.9724 - regression_loss: 0.8631 - classification_loss: 0.1094 402/500 [=======================>......] - ETA: 33s - loss: 0.9713 - regression_loss: 0.8622 - classification_loss: 0.1091 403/500 [=======================>......] - ETA: 32s - loss: 0.9709 - regression_loss: 0.8619 - classification_loss: 0.1090 404/500 [=======================>......] 
- ETA: 32s - loss: 0.9725 - regression_loss: 0.8630 - classification_loss: 0.1095 405/500 [=======================>......] - ETA: 32s - loss: 0.9719 - regression_loss: 0.8625 - classification_loss: 0.1094 406/500 [=======================>......] - ETA: 31s - loss: 0.9707 - regression_loss: 0.8615 - classification_loss: 0.1092 407/500 [=======================>......] - ETA: 31s - loss: 0.9698 - regression_loss: 0.8609 - classification_loss: 0.1089 408/500 [=======================>......] - ETA: 31s - loss: 0.9692 - regression_loss: 0.8604 - classification_loss: 0.1088 409/500 [=======================>......] - ETA: 30s - loss: 0.9684 - regression_loss: 0.8596 - classification_loss: 0.1088 410/500 [=======================>......] - ETA: 30s - loss: 0.9687 - regression_loss: 0.8600 - classification_loss: 0.1088 411/500 [=======================>......] - ETA: 30s - loss: 0.9705 - regression_loss: 0.8615 - classification_loss: 0.1090 412/500 [=======================>......] - ETA: 29s - loss: 0.9711 - regression_loss: 0.8620 - classification_loss: 0.1091 413/500 [=======================>......] - ETA: 29s - loss: 0.9710 - regression_loss: 0.8620 - classification_loss: 0.1090 414/500 [=======================>......] - ETA: 29s - loss: 0.9697 - regression_loss: 0.8608 - classification_loss: 0.1088 415/500 [=======================>......] - ETA: 28s - loss: 0.9698 - regression_loss: 0.8610 - classification_loss: 0.1088 416/500 [=======================>......] - ETA: 28s - loss: 0.9715 - regression_loss: 0.8624 - classification_loss: 0.1090 417/500 [========================>.....] - ETA: 28s - loss: 0.9712 - regression_loss: 0.8622 - classification_loss: 0.1090 418/500 [========================>.....] - ETA: 27s - loss: 0.9760 - regression_loss: 0.8657 - classification_loss: 0.1103 419/500 [========================>.....] - ETA: 27s - loss: 0.9745 - regression_loss: 0.8644 - classification_loss: 0.1101 420/500 [========================>.....] 
[per-batch Keras progress-bar output condensed; epoch-level results kept]

Epoch 25/150 complete:
500/500 [==============================] - 169s 339ms/step - loss: 0.9826 - regression_loss: 0.8709 - classification_loss: 0.1118
326 instances of class plum with average precision: 0.8395
mAP: 0.8395
Epoch 00025: saving model to ./training/snapshots/resnet101_pascal_25.h5

Epoch 26/150 in progress (through batch 253/500 - loss: 1.0033 - regression_loss: 0.8905 - classification_loss: 0.1129)
- ETA: 1:23 - loss: 1.0015 - regression_loss: 0.8889 - classification_loss: 0.1126 255/500 [==============>...............] - ETA: 1:23 - loss: 0.9986 - regression_loss: 0.8865 - classification_loss: 0.1122 256/500 [==============>...............] - ETA: 1:22 - loss: 0.9965 - regression_loss: 0.8847 - classification_loss: 0.1118 257/500 [==============>...............] - ETA: 1:22 - loss: 1.0005 - regression_loss: 0.8884 - classification_loss: 0.1121 258/500 [==============>...............] - ETA: 1:22 - loss: 0.9999 - regression_loss: 0.8879 - classification_loss: 0.1120 259/500 [==============>...............] - ETA: 1:21 - loss: 0.9997 - regression_loss: 0.8876 - classification_loss: 0.1120 260/500 [==============>...............] - ETA: 1:21 - loss: 0.9978 - regression_loss: 0.8861 - classification_loss: 0.1117 261/500 [==============>...............] - ETA: 1:21 - loss: 0.9999 - regression_loss: 0.8882 - classification_loss: 0.1117 262/500 [==============>...............] - ETA: 1:20 - loss: 1.0001 - regression_loss: 0.8885 - classification_loss: 0.1117 263/500 [==============>...............] - ETA: 1:20 - loss: 1.0016 - regression_loss: 0.8897 - classification_loss: 0.1119 264/500 [==============>...............] - ETA: 1:20 - loss: 0.9993 - regression_loss: 0.8877 - classification_loss: 0.1116 265/500 [==============>...............] - ETA: 1:19 - loss: 1.0013 - regression_loss: 0.8896 - classification_loss: 0.1117 266/500 [==============>...............] - ETA: 1:19 - loss: 1.0004 - regression_loss: 0.8887 - classification_loss: 0.1117 267/500 [===============>..............] - ETA: 1:19 - loss: 1.0001 - regression_loss: 0.8885 - classification_loss: 0.1115 268/500 [===============>..............] - ETA: 1:18 - loss: 0.9998 - regression_loss: 0.8883 - classification_loss: 0.1115 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9992 - regression_loss: 0.8878 - classification_loss: 0.1114 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.9978 - regression_loss: 0.8866 - classification_loss: 0.1112 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9962 - regression_loss: 0.8852 - classification_loss: 0.1110 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9957 - regression_loss: 0.8849 - classification_loss: 0.1109 273/500 [===============>..............] - ETA: 1:16 - loss: 0.9956 - regression_loss: 0.8847 - classification_loss: 0.1109 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9956 - regression_loss: 0.8847 - classification_loss: 0.1109 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9946 - regression_loss: 0.8839 - classification_loss: 0.1106 276/500 [===============>..............] - ETA: 1:15 - loss: 0.9928 - regression_loss: 0.8823 - classification_loss: 0.1105 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9938 - regression_loss: 0.8833 - classification_loss: 0.1105 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9939 - regression_loss: 0.8834 - classification_loss: 0.1105 279/500 [===============>..............] - ETA: 1:14 - loss: 0.9913 - regression_loss: 0.8811 - classification_loss: 0.1102 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9905 - regression_loss: 0.8806 - classification_loss: 0.1099 281/500 [===============>..............] - ETA: 1:14 - loss: 0.9897 - regression_loss: 0.8800 - classification_loss: 0.1097 282/500 [===============>..............] - ETA: 1:13 - loss: 0.9885 - regression_loss: 0.8790 - classification_loss: 0.1095 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9883 - regression_loss: 0.8789 - classification_loss: 0.1094 284/500 [================>.............] - ETA: 1:13 - loss: 0.9890 - regression_loss: 0.8796 - classification_loss: 0.1094 285/500 [================>.............] - ETA: 1:12 - loss: 0.9930 - regression_loss: 0.8832 - classification_loss: 0.1098 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.9920 - regression_loss: 0.8824 - classification_loss: 0.1096 287/500 [================>.............] - ETA: 1:12 - loss: 0.9910 - regression_loss: 0.8815 - classification_loss: 0.1094 288/500 [================>.............] - ETA: 1:11 - loss: 0.9894 - regression_loss: 0.8803 - classification_loss: 0.1092 289/500 [================>.............] - ETA: 1:11 - loss: 0.9901 - regression_loss: 0.8809 - classification_loss: 0.1092 290/500 [================>.............] - ETA: 1:11 - loss: 0.9910 - regression_loss: 0.8816 - classification_loss: 0.1095 291/500 [================>.............] - ETA: 1:10 - loss: 0.9893 - regression_loss: 0.8800 - classification_loss: 0.1093 292/500 [================>.............] - ETA: 1:10 - loss: 0.9892 - regression_loss: 0.8799 - classification_loss: 0.1093 293/500 [================>.............] - ETA: 1:10 - loss: 0.9882 - regression_loss: 0.8791 - classification_loss: 0.1091 294/500 [================>.............] - ETA: 1:09 - loss: 0.9875 - regression_loss: 0.8786 - classification_loss: 0.1089 295/500 [================>.............] - ETA: 1:09 - loss: 0.9856 - regression_loss: 0.8767 - classification_loss: 0.1089 296/500 [================>.............] - ETA: 1:09 - loss: 0.9847 - regression_loss: 0.8759 - classification_loss: 0.1087 297/500 [================>.............] - ETA: 1:08 - loss: 0.9856 - regression_loss: 0.8768 - classification_loss: 0.1088 298/500 [================>.............] - ETA: 1:08 - loss: 0.9865 - regression_loss: 0.8776 - classification_loss: 0.1089 299/500 [================>.............] - ETA: 1:08 - loss: 0.9867 - regression_loss: 0.8778 - classification_loss: 0.1089 300/500 [=================>............] - ETA: 1:07 - loss: 0.9864 - regression_loss: 0.8774 - classification_loss: 0.1089 301/500 [=================>............] - ETA: 1:07 - loss: 0.9862 - regression_loss: 0.8774 - classification_loss: 0.1089 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.9864 - regression_loss: 0.8776 - classification_loss: 0.1088 303/500 [=================>............] - ETA: 1:06 - loss: 0.9840 - regression_loss: 0.8755 - classification_loss: 0.1085 304/500 [=================>............] - ETA: 1:06 - loss: 0.9841 - regression_loss: 0.8756 - classification_loss: 0.1085 305/500 [=================>............] - ETA: 1:06 - loss: 0.9821 - regression_loss: 0.8738 - classification_loss: 0.1082 306/500 [=================>............] - ETA: 1:05 - loss: 0.9817 - regression_loss: 0.8735 - classification_loss: 0.1081 307/500 [=================>............] - ETA: 1:05 - loss: 0.9824 - regression_loss: 0.8741 - classification_loss: 0.1083 308/500 [=================>............] - ETA: 1:05 - loss: 0.9851 - regression_loss: 0.8768 - classification_loss: 0.1083 309/500 [=================>............] - ETA: 1:04 - loss: 0.9876 - regression_loss: 0.8788 - classification_loss: 0.1089 310/500 [=================>............] - ETA: 1:04 - loss: 0.9872 - regression_loss: 0.8784 - classification_loss: 0.1088 311/500 [=================>............] - ETA: 1:04 - loss: 0.9892 - regression_loss: 0.8800 - classification_loss: 0.1092 312/500 [=================>............] - ETA: 1:03 - loss: 0.9880 - regression_loss: 0.8790 - classification_loss: 0.1090 313/500 [=================>............] - ETA: 1:03 - loss: 0.9881 - regression_loss: 0.8792 - classification_loss: 0.1089 314/500 [=================>............] - ETA: 1:03 - loss: 0.9876 - regression_loss: 0.8787 - classification_loss: 0.1088 315/500 [=================>............] - ETA: 1:02 - loss: 0.9872 - regression_loss: 0.8784 - classification_loss: 0.1088 316/500 [=================>............] - ETA: 1:02 - loss: 0.9854 - regression_loss: 0.8768 - classification_loss: 0.1085 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9856 - regression_loss: 0.8772 - classification_loss: 0.1084 318/500 [==================>...........] 
- ETA: 1:01 - loss: 0.9851 - regression_loss: 0.8765 - classification_loss: 0.1086 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9839 - regression_loss: 0.8755 - classification_loss: 0.1084 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9843 - regression_loss: 0.8756 - classification_loss: 0.1086 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9841 - regression_loss: 0.8755 - classification_loss: 0.1086 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9855 - regression_loss: 0.8767 - classification_loss: 0.1088 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9852 - regression_loss: 0.8766 - classification_loss: 0.1087 324/500 [==================>...........] - ETA: 59s - loss: 0.9847 - regression_loss: 0.8761 - classification_loss: 0.1086  325/500 [==================>...........] - ETA: 59s - loss: 0.9847 - regression_loss: 0.8762 - classification_loss: 0.1085 326/500 [==================>...........] - ETA: 59s - loss: 0.9847 - regression_loss: 0.8762 - classification_loss: 0.1085 327/500 [==================>...........] - ETA: 58s - loss: 0.9859 - regression_loss: 0.8771 - classification_loss: 0.1088 328/500 [==================>...........] - ETA: 58s - loss: 0.9843 - regression_loss: 0.8756 - classification_loss: 0.1086 329/500 [==================>...........] - ETA: 58s - loss: 0.9846 - regression_loss: 0.8759 - classification_loss: 0.1087 330/500 [==================>...........] - ETA: 57s - loss: 0.9841 - regression_loss: 0.8756 - classification_loss: 0.1085 331/500 [==================>...........] - ETA: 57s - loss: 0.9840 - regression_loss: 0.8755 - classification_loss: 0.1084 332/500 [==================>...........] - ETA: 57s - loss: 0.9853 - regression_loss: 0.8766 - classification_loss: 0.1087 333/500 [==================>...........] - ETA: 56s - loss: 0.9846 - regression_loss: 0.8761 - classification_loss: 0.1085 334/500 [===================>..........] 
- ETA: 56s - loss: 0.9841 - regression_loss: 0.8757 - classification_loss: 0.1084 335/500 [===================>..........] - ETA: 56s - loss: 0.9851 - regression_loss: 0.8762 - classification_loss: 0.1089 336/500 [===================>..........] - ETA: 55s - loss: 0.9878 - regression_loss: 0.8785 - classification_loss: 0.1093 337/500 [===================>..........] - ETA: 55s - loss: 0.9870 - regression_loss: 0.8779 - classification_loss: 0.1091 338/500 [===================>..........] - ETA: 55s - loss: 0.9903 - regression_loss: 0.8807 - classification_loss: 0.1096 339/500 [===================>..........] - ETA: 54s - loss: 0.9934 - regression_loss: 0.8833 - classification_loss: 0.1101 340/500 [===================>..........] - ETA: 54s - loss: 0.9940 - regression_loss: 0.8839 - classification_loss: 0.1101 341/500 [===================>..........] - ETA: 53s - loss: 0.9949 - regression_loss: 0.8847 - classification_loss: 0.1103 342/500 [===================>..........] - ETA: 53s - loss: 0.9945 - regression_loss: 0.8844 - classification_loss: 0.1101 343/500 [===================>..........] - ETA: 53s - loss: 0.9938 - regression_loss: 0.8836 - classification_loss: 0.1102 344/500 [===================>..........] - ETA: 52s - loss: 0.9936 - regression_loss: 0.8834 - classification_loss: 0.1102 345/500 [===================>..........] - ETA: 52s - loss: 0.9954 - regression_loss: 0.8849 - classification_loss: 0.1105 346/500 [===================>..........] - ETA: 52s - loss: 0.9945 - regression_loss: 0.8842 - classification_loss: 0.1104 347/500 [===================>..........] - ETA: 51s - loss: 1.0007 - regression_loss: 0.8881 - classification_loss: 0.1126 348/500 [===================>..........] - ETA: 51s - loss: 1.0003 - regression_loss: 0.8877 - classification_loss: 0.1126 349/500 [===================>..........] - ETA: 51s - loss: 1.0001 - regression_loss: 0.8875 - classification_loss: 0.1126 350/500 [====================>.........] 
- ETA: 50s - loss: 1.0006 - regression_loss: 0.8882 - classification_loss: 0.1124 351/500 [====================>.........] - ETA: 50s - loss: 1.0013 - regression_loss: 0.8890 - classification_loss: 0.1124 352/500 [====================>.........] - ETA: 50s - loss: 1.0038 - regression_loss: 0.8912 - classification_loss: 0.1126 353/500 [====================>.........] - ETA: 49s - loss: 1.0036 - regression_loss: 0.8911 - classification_loss: 0.1125 354/500 [====================>.........] - ETA: 49s - loss: 1.0032 - regression_loss: 0.8908 - classification_loss: 0.1124 355/500 [====================>.........] - ETA: 49s - loss: 1.0032 - regression_loss: 0.8909 - classification_loss: 0.1123 356/500 [====================>.........] - ETA: 48s - loss: 1.0035 - regression_loss: 0.8912 - classification_loss: 0.1123 357/500 [====================>.........] - ETA: 48s - loss: 1.0025 - regression_loss: 0.8903 - classification_loss: 0.1121 358/500 [====================>.........] - ETA: 48s - loss: 1.0011 - regression_loss: 0.8892 - classification_loss: 0.1120 359/500 [====================>.........] - ETA: 47s - loss: 1.0023 - regression_loss: 0.8900 - classification_loss: 0.1122 360/500 [====================>.........] - ETA: 47s - loss: 1.0013 - regression_loss: 0.8892 - classification_loss: 0.1121 361/500 [====================>.........] - ETA: 47s - loss: 1.0009 - regression_loss: 0.8889 - classification_loss: 0.1120 362/500 [====================>.........] - ETA: 46s - loss: 0.9994 - regression_loss: 0.8875 - classification_loss: 0.1119 363/500 [====================>.........] - ETA: 46s - loss: 1.0001 - regression_loss: 0.8881 - classification_loss: 0.1120 364/500 [====================>.........] - ETA: 46s - loss: 1.0013 - regression_loss: 0.8891 - classification_loss: 0.1123 365/500 [====================>.........] - ETA: 45s - loss: 1.0021 - regression_loss: 0.8893 - classification_loss: 0.1127 366/500 [====================>.........] 
- ETA: 45s - loss: 1.0024 - regression_loss: 0.8895 - classification_loss: 0.1129 367/500 [=====================>........] - ETA: 45s - loss: 1.0020 - regression_loss: 0.8891 - classification_loss: 0.1128 368/500 [=====================>........] - ETA: 44s - loss: 1.0024 - regression_loss: 0.8895 - classification_loss: 0.1128 369/500 [=====================>........] - ETA: 44s - loss: 1.0044 - regression_loss: 0.8916 - classification_loss: 0.1129 370/500 [=====================>........] - ETA: 44s - loss: 1.0040 - regression_loss: 0.8912 - classification_loss: 0.1128 371/500 [=====================>........] - ETA: 43s - loss: 1.0055 - regression_loss: 0.8925 - classification_loss: 0.1130 372/500 [=====================>........] - ETA: 43s - loss: 1.0063 - regression_loss: 0.8931 - classification_loss: 0.1131 373/500 [=====================>........] - ETA: 43s - loss: 1.0070 - regression_loss: 0.8939 - classification_loss: 0.1132 374/500 [=====================>........] - ETA: 42s - loss: 1.0066 - regression_loss: 0.8935 - classification_loss: 0.1131 375/500 [=====================>........] - ETA: 42s - loss: 1.0075 - regression_loss: 0.8943 - classification_loss: 0.1132 376/500 [=====================>........] - ETA: 42s - loss: 1.0079 - regression_loss: 0.8947 - classification_loss: 0.1132 377/500 [=====================>........] - ETA: 41s - loss: 1.0077 - regression_loss: 0.8945 - classification_loss: 0.1132 378/500 [=====================>........] - ETA: 41s - loss: 1.0072 - regression_loss: 0.8942 - classification_loss: 0.1130 379/500 [=====================>........] - ETA: 41s - loss: 1.0088 - regression_loss: 0.8956 - classification_loss: 0.1133 380/500 [=====================>........] - ETA: 40s - loss: 1.0087 - regression_loss: 0.8955 - classification_loss: 0.1132 381/500 [=====================>........] - ETA: 40s - loss: 1.0075 - regression_loss: 0.8945 - classification_loss: 0.1130 382/500 [=====================>........] 
- ETA: 40s - loss: 1.0070 - regression_loss: 0.8940 - classification_loss: 0.1130 383/500 [=====================>........] - ETA: 39s - loss: 1.0075 - regression_loss: 0.8943 - classification_loss: 0.1132 384/500 [======================>.......] - ETA: 39s - loss: 1.0075 - regression_loss: 0.8941 - classification_loss: 0.1133 385/500 [======================>.......] - ETA: 39s - loss: 1.0070 - regression_loss: 0.8938 - classification_loss: 0.1132 386/500 [======================>.......] - ETA: 38s - loss: 1.0059 - regression_loss: 0.8929 - classification_loss: 0.1130 387/500 [======================>.......] - ETA: 38s - loss: 1.0069 - regression_loss: 0.8933 - classification_loss: 0.1136 388/500 [======================>.......] - ETA: 38s - loss: 1.0092 - regression_loss: 0.8952 - classification_loss: 0.1139 389/500 [======================>.......] - ETA: 37s - loss: 1.0087 - regression_loss: 0.8948 - classification_loss: 0.1139 390/500 [======================>.......] - ETA: 37s - loss: 1.0078 - regression_loss: 0.8941 - classification_loss: 0.1137 391/500 [======================>.......] - ETA: 37s - loss: 1.0077 - regression_loss: 0.8940 - classification_loss: 0.1137 392/500 [======================>.......] - ETA: 36s - loss: 1.0079 - regression_loss: 0.8944 - classification_loss: 0.1135 393/500 [======================>.......] - ETA: 36s - loss: 1.0087 - regression_loss: 0.8951 - classification_loss: 0.1136 394/500 [======================>.......] - ETA: 35s - loss: 1.0077 - regression_loss: 0.8943 - classification_loss: 0.1134 395/500 [======================>.......] - ETA: 35s - loss: 1.0063 - regression_loss: 0.8931 - classification_loss: 0.1132 396/500 [======================>.......] - ETA: 35s - loss: 1.0053 - regression_loss: 0.8923 - classification_loss: 0.1130 397/500 [======================>.......] - ETA: 34s - loss: 1.0043 - regression_loss: 0.8914 - classification_loss: 0.1129 398/500 [======================>.......] 
- ETA: 34s - loss: 1.0069 - regression_loss: 0.8937 - classification_loss: 0.1132 399/500 [======================>.......] - ETA: 34s - loss: 1.0073 - regression_loss: 0.8942 - classification_loss: 0.1131 400/500 [=======================>......] - ETA: 33s - loss: 1.0061 - regression_loss: 0.8933 - classification_loss: 0.1128 401/500 [=======================>......] - ETA: 33s - loss: 1.0050 - regression_loss: 0.8924 - classification_loss: 0.1126 402/500 [=======================>......] - ETA: 33s - loss: 1.0047 - regression_loss: 0.8922 - classification_loss: 0.1125 403/500 [=======================>......] - ETA: 32s - loss: 1.0031 - regression_loss: 0.8909 - classification_loss: 0.1123 404/500 [=======================>......] - ETA: 32s - loss: 1.0038 - regression_loss: 0.8915 - classification_loss: 0.1123 405/500 [=======================>......] - ETA: 32s - loss: 1.0031 - regression_loss: 0.8909 - classification_loss: 0.1121 406/500 [=======================>......] - ETA: 31s - loss: 1.0028 - regression_loss: 0.8908 - classification_loss: 0.1120 407/500 [=======================>......] - ETA: 31s - loss: 1.0037 - regression_loss: 0.8915 - classification_loss: 0.1122 408/500 [=======================>......] - ETA: 31s - loss: 1.0029 - regression_loss: 0.8909 - classification_loss: 0.1120 409/500 [=======================>......] - ETA: 30s - loss: 1.0023 - regression_loss: 0.8904 - classification_loss: 0.1119 410/500 [=======================>......] - ETA: 30s - loss: 1.0012 - regression_loss: 0.8895 - classification_loss: 0.1117 411/500 [=======================>......] - ETA: 30s - loss: 1.0001 - regression_loss: 0.8886 - classification_loss: 0.1116 412/500 [=======================>......] - ETA: 29s - loss: 1.0002 - regression_loss: 0.8886 - classification_loss: 0.1116 413/500 [=======================>......] - ETA: 29s - loss: 1.0017 - regression_loss: 0.8900 - classification_loss: 0.1117 414/500 [=======================>......] 
- ETA: 29s - loss: 1.0011 - regression_loss: 0.8895 - classification_loss: 0.1116 415/500 [=======================>......] - ETA: 28s - loss: 1.0014 - regression_loss: 0.8897 - classification_loss: 0.1117 416/500 [=======================>......] - ETA: 28s - loss: 1.0020 - regression_loss: 0.8904 - classification_loss: 0.1116 417/500 [========================>.....] - ETA: 28s - loss: 1.0014 - regression_loss: 0.8900 - classification_loss: 0.1115 418/500 [========================>.....] - ETA: 27s - loss: 1.0014 - regression_loss: 0.8898 - classification_loss: 0.1115 419/500 [========================>.....] - ETA: 27s - loss: 1.0007 - regression_loss: 0.8892 - classification_loss: 0.1114 420/500 [========================>.....] - ETA: 27s - loss: 1.0012 - regression_loss: 0.8897 - classification_loss: 0.1115 421/500 [========================>.....] - ETA: 26s - loss: 1.0005 - regression_loss: 0.8892 - classification_loss: 0.1113 422/500 [========================>.....] - ETA: 26s - loss: 0.9990 - regression_loss: 0.8878 - classification_loss: 0.1112 423/500 [========================>.....] - ETA: 26s - loss: 0.9982 - regression_loss: 0.8872 - classification_loss: 0.1110 424/500 [========================>.....] - ETA: 25s - loss: 0.9983 - regression_loss: 0.8873 - classification_loss: 0.1109 425/500 [========================>.....] - ETA: 25s - loss: 0.9989 - regression_loss: 0.8880 - classification_loss: 0.1110 426/500 [========================>.....] - ETA: 25s - loss: 0.9998 - regression_loss: 0.8887 - classification_loss: 0.1111 427/500 [========================>.....] - ETA: 24s - loss: 0.9999 - regression_loss: 0.8888 - classification_loss: 0.1111 428/500 [========================>.....] - ETA: 24s - loss: 0.9993 - regression_loss: 0.8883 - classification_loss: 0.1110 429/500 [========================>.....] - ETA: 24s - loss: 0.9991 - regression_loss: 0.8883 - classification_loss: 0.1109 430/500 [========================>.....] 
- ETA: 23s - loss: 0.9996 - regression_loss: 0.8884 - classification_loss: 0.1111 431/500 [========================>.....] - ETA: 23s - loss: 0.9995 - regression_loss: 0.8883 - classification_loss: 0.1112 432/500 [========================>.....] - ETA: 23s - loss: 1.0009 - regression_loss: 0.8894 - classification_loss: 0.1115 433/500 [========================>.....] - ETA: 22s - loss: 1.0004 - regression_loss: 0.8890 - classification_loss: 0.1114 434/500 [=========================>....] - ETA: 22s - loss: 1.0008 - regression_loss: 0.8894 - classification_loss: 0.1114 435/500 [=========================>....] - ETA: 22s - loss: 1.0006 - regression_loss: 0.8893 - classification_loss: 0.1113 436/500 [=========================>....] - ETA: 21s - loss: 1.0016 - regression_loss: 0.8901 - classification_loss: 0.1115 437/500 [=========================>....] - ETA: 21s - loss: 1.0004 - regression_loss: 0.8889 - classification_loss: 0.1115 438/500 [=========================>....] - ETA: 21s - loss: 1.0014 - regression_loss: 0.8898 - classification_loss: 0.1116 439/500 [=========================>....] - ETA: 20s - loss: 1.0014 - regression_loss: 0.8899 - classification_loss: 0.1115 440/500 [=========================>....] - ETA: 20s - loss: 1.0007 - regression_loss: 0.8893 - classification_loss: 0.1114 441/500 [=========================>....] - ETA: 20s - loss: 0.9998 - regression_loss: 0.8885 - classification_loss: 0.1112 442/500 [=========================>....] - ETA: 19s - loss: 0.9986 - regression_loss: 0.8874 - classification_loss: 0.1112 443/500 [=========================>....] - ETA: 19s - loss: 0.9992 - regression_loss: 0.8879 - classification_loss: 0.1113 444/500 [=========================>....] - ETA: 19s - loss: 1.0000 - regression_loss: 0.8886 - classification_loss: 0.1114 445/500 [=========================>....] - ETA: 18s - loss: 0.9997 - regression_loss: 0.8884 - classification_loss: 0.1113 446/500 [=========================>....] 
- ETA: 18s - loss: 0.9983 - regression_loss: 0.8872 - classification_loss: 0.1111 447/500 [=========================>....] - ETA: 18s - loss: 0.9983 - regression_loss: 0.8873 - classification_loss: 0.1110 448/500 [=========================>....] - ETA: 17s - loss: 0.9995 - regression_loss: 0.8883 - classification_loss: 0.1112 449/500 [=========================>....] - ETA: 17s - loss: 0.9991 - regression_loss: 0.8880 - classification_loss: 0.1111 450/500 [==========================>...] - ETA: 17s - loss: 0.9983 - regression_loss: 0.8873 - classification_loss: 0.1110 451/500 [==========================>...] - ETA: 16s - loss: 0.9976 - regression_loss: 0.8867 - classification_loss: 0.1109 452/500 [==========================>...] - ETA: 16s - loss: 0.9976 - regression_loss: 0.8866 - classification_loss: 0.1110 453/500 [==========================>...] - ETA: 15s - loss: 0.9989 - regression_loss: 0.8878 - classification_loss: 0.1112 454/500 [==========================>...] - ETA: 15s - loss: 0.9988 - regression_loss: 0.8875 - classification_loss: 0.1113 455/500 [==========================>...] - ETA: 15s - loss: 0.9986 - regression_loss: 0.8874 - classification_loss: 0.1112 456/500 [==========================>...] - ETA: 14s - loss: 0.9984 - regression_loss: 0.8874 - classification_loss: 0.1110 457/500 [==========================>...] - ETA: 14s - loss: 0.9980 - regression_loss: 0.8869 - classification_loss: 0.1111 458/500 [==========================>...] - ETA: 14s - loss: 0.9976 - regression_loss: 0.8867 - classification_loss: 0.1109 459/500 [==========================>...] - ETA: 13s - loss: 0.9972 - regression_loss: 0.8864 - classification_loss: 0.1108 460/500 [==========================>...] - ETA: 13s - loss: 0.9972 - regression_loss: 0.8865 - classification_loss: 0.1107 461/500 [==========================>...] - ETA: 13s - loss: 0.9965 - regression_loss: 0.8859 - classification_loss: 0.1106 462/500 [==========================>...] 
463/500 [==========================>...] - ETA: 12s - loss: 0.9965 - regression_loss: 0.8858 - classification_loss: 0.1106
...
500/500 [==============================] - 170s 340ms/step - loss: 1.0025 - regression_loss: 0.8904 - classification_loss: 0.1122
326 instances of class plum with average precision: 0.8400
mAP: 0.8400
Epoch 00026: saving model to ./training/snapshots/resnet101_pascal_26.h5
Epoch 27/150
1/500 [..............................] - ETA: 2:40 - loss: 0.5673 - regression_loss: 0.5446 - classification_loss: 0.0226
...
296/500 [================>.............] - ETA: 1:09 - loss: 0.9633 - regression_loss: 0.8499 - classification_loss: 0.1134
297/500 [================>.............]
- ETA: 1:08 - loss: 0.9629 - regression_loss: 0.8495 - classification_loss: 0.1134 298/500 [================>.............] - ETA: 1:08 - loss: 0.9612 - regression_loss: 0.8481 - classification_loss: 0.1132 299/500 [================>.............] - ETA: 1:08 - loss: 0.9644 - regression_loss: 0.8506 - classification_loss: 0.1138 300/500 [=================>............] - ETA: 1:07 - loss: 0.9654 - regression_loss: 0.8517 - classification_loss: 0.1137 301/500 [=================>............] - ETA: 1:07 - loss: 0.9643 - regression_loss: 0.8506 - classification_loss: 0.1136 302/500 [=================>............] - ETA: 1:07 - loss: 0.9669 - regression_loss: 0.8526 - classification_loss: 0.1143 303/500 [=================>............] - ETA: 1:06 - loss: 0.9662 - regression_loss: 0.8521 - classification_loss: 0.1141 304/500 [=================>............] - ETA: 1:06 - loss: 0.9679 - regression_loss: 0.8534 - classification_loss: 0.1145 305/500 [=================>............] - ETA: 1:06 - loss: 0.9675 - regression_loss: 0.8532 - classification_loss: 0.1143 306/500 [=================>............] - ETA: 1:05 - loss: 0.9689 - regression_loss: 0.8547 - classification_loss: 0.1142 307/500 [=================>............] - ETA: 1:05 - loss: 0.9680 - regression_loss: 0.8540 - classification_loss: 0.1140 308/500 [=================>............] - ETA: 1:05 - loss: 0.9679 - regression_loss: 0.8539 - classification_loss: 0.1140 309/500 [=================>............] - ETA: 1:04 - loss: 0.9677 - regression_loss: 0.8538 - classification_loss: 0.1139 310/500 [=================>............] - ETA: 1:04 - loss: 0.9686 - regression_loss: 0.8547 - classification_loss: 0.1139 311/500 [=================>............] - ETA: 1:04 - loss: 0.9696 - regression_loss: 0.8555 - classification_loss: 0.1142 312/500 [=================>............] - ETA: 1:03 - loss: 0.9690 - regression_loss: 0.8550 - classification_loss: 0.1140 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.9699 - regression_loss: 0.8557 - classification_loss: 0.1142 314/500 [=================>............] - ETA: 1:03 - loss: 0.9695 - regression_loss: 0.8555 - classification_loss: 0.1141 315/500 [=================>............] - ETA: 1:02 - loss: 0.9690 - regression_loss: 0.8551 - classification_loss: 0.1139 316/500 [=================>............] - ETA: 1:02 - loss: 0.9696 - regression_loss: 0.8557 - classification_loss: 0.1140 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9696 - regression_loss: 0.8557 - classification_loss: 0.1139 318/500 [==================>...........] - ETA: 1:01 - loss: 0.9717 - regression_loss: 0.8577 - classification_loss: 0.1140 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9732 - regression_loss: 0.8589 - classification_loss: 0.1142 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9719 - regression_loss: 0.8577 - classification_loss: 0.1141 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9719 - regression_loss: 0.8578 - classification_loss: 0.1141 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9714 - regression_loss: 0.8574 - classification_loss: 0.1140 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9713 - regression_loss: 0.8575 - classification_loss: 0.1139 324/500 [==================>...........] - ETA: 59s - loss: 0.9747 - regression_loss: 0.8604 - classification_loss: 0.1142  325/500 [==================>...........] - ETA: 59s - loss: 0.9751 - regression_loss: 0.8608 - classification_loss: 0.1143 326/500 [==================>...........] - ETA: 59s - loss: 0.9758 - regression_loss: 0.8614 - classification_loss: 0.1144 327/500 [==================>...........] - ETA: 58s - loss: 0.9744 - regression_loss: 0.8602 - classification_loss: 0.1143 328/500 [==================>...........] - ETA: 58s - loss: 0.9752 - regression_loss: 0.8609 - classification_loss: 0.1143 329/500 [==================>...........] 
- ETA: 58s - loss: 0.9773 - regression_loss: 0.8628 - classification_loss: 0.1145 330/500 [==================>...........] - ETA: 57s - loss: 0.9781 - regression_loss: 0.8636 - classification_loss: 0.1145 331/500 [==================>...........] - ETA: 57s - loss: 0.9760 - regression_loss: 0.8618 - classification_loss: 0.1142 332/500 [==================>...........] - ETA: 57s - loss: 0.9781 - regression_loss: 0.8637 - classification_loss: 0.1144 333/500 [==================>...........] - ETA: 56s - loss: 0.9775 - regression_loss: 0.8633 - classification_loss: 0.1142 334/500 [===================>..........] - ETA: 56s - loss: 0.9764 - regression_loss: 0.8624 - classification_loss: 0.1140 335/500 [===================>..........] - ETA: 56s - loss: 0.9747 - regression_loss: 0.8609 - classification_loss: 0.1137 336/500 [===================>..........] - ETA: 55s - loss: 0.9732 - regression_loss: 0.8597 - classification_loss: 0.1135 337/500 [===================>..........] - ETA: 55s - loss: 0.9713 - regression_loss: 0.8581 - classification_loss: 0.1132 338/500 [===================>..........] - ETA: 55s - loss: 0.9704 - regression_loss: 0.8574 - classification_loss: 0.1130 339/500 [===================>..........] - ETA: 54s - loss: 0.9708 - regression_loss: 0.8578 - classification_loss: 0.1129 340/500 [===================>..........] - ETA: 54s - loss: 0.9723 - regression_loss: 0.8591 - classification_loss: 0.1132 341/500 [===================>..........] - ETA: 54s - loss: 0.9705 - regression_loss: 0.8576 - classification_loss: 0.1129 342/500 [===================>..........] - ETA: 53s - loss: 0.9741 - regression_loss: 0.8605 - classification_loss: 0.1136 343/500 [===================>..........] - ETA: 53s - loss: 0.9739 - regression_loss: 0.8604 - classification_loss: 0.1135 344/500 [===================>..........] - ETA: 53s - loss: 0.9731 - regression_loss: 0.8595 - classification_loss: 0.1135 345/500 [===================>..........] 
- ETA: 52s - loss: 0.9732 - regression_loss: 0.8598 - classification_loss: 0.1134 346/500 [===================>..........] - ETA: 52s - loss: 0.9737 - regression_loss: 0.8603 - classification_loss: 0.1134 347/500 [===================>..........] - ETA: 52s - loss: 0.9730 - regression_loss: 0.8598 - classification_loss: 0.1132 348/500 [===================>..........] - ETA: 51s - loss: 0.9719 - regression_loss: 0.8588 - classification_loss: 0.1130 349/500 [===================>..........] - ETA: 51s - loss: 0.9717 - regression_loss: 0.8588 - classification_loss: 0.1129 350/500 [====================>.........] - ETA: 50s - loss: 0.9702 - regression_loss: 0.8576 - classification_loss: 0.1126 351/500 [====================>.........] - ETA: 50s - loss: 0.9701 - regression_loss: 0.8575 - classification_loss: 0.1126 352/500 [====================>.........] - ETA: 50s - loss: 0.9699 - regression_loss: 0.8574 - classification_loss: 0.1124 353/500 [====================>.........] - ETA: 49s - loss: 0.9695 - regression_loss: 0.8572 - classification_loss: 0.1123 354/500 [====================>.........] - ETA: 49s - loss: 0.9691 - regression_loss: 0.8570 - classification_loss: 0.1121 355/500 [====================>.........] - ETA: 49s - loss: 0.9701 - regression_loss: 0.8581 - classification_loss: 0.1121 356/500 [====================>.........] - ETA: 48s - loss: 0.9702 - regression_loss: 0.8582 - classification_loss: 0.1120 357/500 [====================>.........] - ETA: 48s - loss: 0.9694 - regression_loss: 0.8575 - classification_loss: 0.1119 358/500 [====================>.........] - ETA: 48s - loss: 0.9687 - regression_loss: 0.8569 - classification_loss: 0.1118 359/500 [====================>.........] - ETA: 47s - loss: 0.9696 - regression_loss: 0.8576 - classification_loss: 0.1120 360/500 [====================>.........] - ETA: 47s - loss: 0.9684 - regression_loss: 0.8566 - classification_loss: 0.1118 361/500 [====================>.........] 
- ETA: 47s - loss: 0.9666 - regression_loss: 0.8550 - classification_loss: 0.1115 362/500 [====================>.........] - ETA: 46s - loss: 0.9663 - regression_loss: 0.8548 - classification_loss: 0.1115 363/500 [====================>.........] - ETA: 46s - loss: 0.9682 - regression_loss: 0.8562 - classification_loss: 0.1120 364/500 [====================>.........] - ETA: 46s - loss: 0.9687 - regression_loss: 0.8567 - classification_loss: 0.1120 365/500 [====================>.........] - ETA: 45s - loss: 0.9685 - regression_loss: 0.8565 - classification_loss: 0.1120 366/500 [====================>.........] - ETA: 45s - loss: 0.9676 - regression_loss: 0.8557 - classification_loss: 0.1119 367/500 [=====================>........] - ETA: 45s - loss: 0.9678 - regression_loss: 0.8560 - classification_loss: 0.1118 368/500 [=====================>........] - ETA: 44s - loss: 0.9659 - regression_loss: 0.8543 - classification_loss: 0.1116 369/500 [=====================>........] - ETA: 44s - loss: 0.9641 - regression_loss: 0.8527 - classification_loss: 0.1114 370/500 [=====================>........] - ETA: 44s - loss: 0.9634 - regression_loss: 0.8522 - classification_loss: 0.1112 371/500 [=====================>........] - ETA: 43s - loss: 0.9631 - regression_loss: 0.8520 - classification_loss: 0.1111 372/500 [=====================>........] - ETA: 43s - loss: 0.9625 - regression_loss: 0.8515 - classification_loss: 0.1110 373/500 [=====================>........] - ETA: 43s - loss: 0.9623 - regression_loss: 0.8514 - classification_loss: 0.1109 374/500 [=====================>........] - ETA: 42s - loss: 0.9636 - regression_loss: 0.8525 - classification_loss: 0.1110 375/500 [=====================>........] - ETA: 42s - loss: 0.9636 - regression_loss: 0.8527 - classification_loss: 0.1109 376/500 [=====================>........] - ETA: 42s - loss: 0.9639 - regression_loss: 0.8530 - classification_loss: 0.1108 377/500 [=====================>........] 
- ETA: 41s - loss: 0.9638 - regression_loss: 0.8529 - classification_loss: 0.1109 378/500 [=====================>........] - ETA: 41s - loss: 0.9641 - regression_loss: 0.8530 - classification_loss: 0.1111 379/500 [=====================>........] - ETA: 41s - loss: 0.9649 - regression_loss: 0.8537 - classification_loss: 0.1112 380/500 [=====================>........] - ETA: 40s - loss: 0.9644 - regression_loss: 0.8533 - classification_loss: 0.1110 381/500 [=====================>........] - ETA: 40s - loss: 0.9643 - regression_loss: 0.8533 - classification_loss: 0.1110 382/500 [=====================>........] - ETA: 40s - loss: 0.9638 - regression_loss: 0.8530 - classification_loss: 0.1108 383/500 [=====================>........] - ETA: 39s - loss: 0.9635 - regression_loss: 0.8527 - classification_loss: 0.1109 384/500 [======================>.......] - ETA: 39s - loss: 0.9618 - regression_loss: 0.8512 - classification_loss: 0.1106 385/500 [======================>.......] - ETA: 39s - loss: 0.9611 - regression_loss: 0.8506 - classification_loss: 0.1105 386/500 [======================>.......] - ETA: 38s - loss: 0.9622 - regression_loss: 0.8516 - classification_loss: 0.1106 387/500 [======================>.......] - ETA: 38s - loss: 0.9623 - regression_loss: 0.8519 - classification_loss: 0.1104 388/500 [======================>.......] - ETA: 38s - loss: 0.9621 - regression_loss: 0.8519 - classification_loss: 0.1102 389/500 [======================>.......] - ETA: 37s - loss: 0.9622 - regression_loss: 0.8520 - classification_loss: 0.1102 390/500 [======================>.......] - ETA: 37s - loss: 0.9619 - regression_loss: 0.8518 - classification_loss: 0.1101 391/500 [======================>.......] - ETA: 37s - loss: 0.9613 - regression_loss: 0.8513 - classification_loss: 0.1100 392/500 [======================>.......] - ETA: 36s - loss: 0.9607 - regression_loss: 0.8508 - classification_loss: 0.1100 393/500 [======================>.......] 
- ETA: 36s - loss: 0.9605 - regression_loss: 0.8507 - classification_loss: 0.1098 394/500 [======================>.......] - ETA: 36s - loss: 0.9607 - regression_loss: 0.8511 - classification_loss: 0.1097 395/500 [======================>.......] - ETA: 35s - loss: 0.9616 - regression_loss: 0.8518 - classification_loss: 0.1098 396/500 [======================>.......] - ETA: 35s - loss: 0.9608 - regression_loss: 0.8510 - classification_loss: 0.1098 397/500 [======================>.......] - ETA: 34s - loss: 0.9610 - regression_loss: 0.8511 - classification_loss: 0.1098 398/500 [======================>.......] - ETA: 34s - loss: 0.9624 - regression_loss: 0.8523 - classification_loss: 0.1101 399/500 [======================>.......] - ETA: 34s - loss: 0.9620 - regression_loss: 0.8520 - classification_loss: 0.1101 400/500 [=======================>......] - ETA: 33s - loss: 0.9648 - regression_loss: 0.8543 - classification_loss: 0.1104 401/500 [=======================>......] - ETA: 33s - loss: 0.9638 - regression_loss: 0.8536 - classification_loss: 0.1103 402/500 [=======================>......] - ETA: 33s - loss: 0.9635 - regression_loss: 0.8534 - classification_loss: 0.1101 403/500 [=======================>......] - ETA: 32s - loss: 0.9630 - regression_loss: 0.8530 - classification_loss: 0.1100 404/500 [=======================>......] - ETA: 32s - loss: 0.9623 - regression_loss: 0.8524 - classification_loss: 0.1099 405/500 [=======================>......] - ETA: 32s - loss: 0.9616 - regression_loss: 0.8519 - classification_loss: 0.1098 406/500 [=======================>......] - ETA: 31s - loss: 0.9598 - regression_loss: 0.8502 - classification_loss: 0.1096 407/500 [=======================>......] - ETA: 31s - loss: 0.9589 - regression_loss: 0.8495 - classification_loss: 0.1094 408/500 [=======================>......] - ETA: 31s - loss: 0.9581 - regression_loss: 0.8488 - classification_loss: 0.1093 409/500 [=======================>......] 
- ETA: 30s - loss: 0.9576 - regression_loss: 0.8485 - classification_loss: 0.1091 410/500 [=======================>......] - ETA: 30s - loss: 0.9570 - regression_loss: 0.8480 - classification_loss: 0.1091 411/500 [=======================>......] - ETA: 30s - loss: 0.9557 - regression_loss: 0.8467 - classification_loss: 0.1090 412/500 [=======================>......] - ETA: 29s - loss: 0.9545 - regression_loss: 0.8457 - classification_loss: 0.1088 413/500 [=======================>......] - ETA: 29s - loss: 0.9561 - regression_loss: 0.8469 - classification_loss: 0.1092 414/500 [=======================>......] - ETA: 29s - loss: 0.9565 - regression_loss: 0.8473 - classification_loss: 0.1092 415/500 [=======================>......] - ETA: 28s - loss: 0.9565 - regression_loss: 0.8473 - classification_loss: 0.1092 416/500 [=======================>......] - ETA: 28s - loss: 0.9560 - regression_loss: 0.8469 - classification_loss: 0.1090 417/500 [========================>.....] - ETA: 28s - loss: 0.9570 - regression_loss: 0.8477 - classification_loss: 0.1093 418/500 [========================>.....] - ETA: 27s - loss: 0.9567 - regression_loss: 0.8474 - classification_loss: 0.1092 419/500 [========================>.....] - ETA: 27s - loss: 0.9560 - regression_loss: 0.8470 - classification_loss: 0.1090 420/500 [========================>.....] - ETA: 27s - loss: 0.9557 - regression_loss: 0.8468 - classification_loss: 0.1089 421/500 [========================>.....] - ETA: 26s - loss: 0.9558 - regression_loss: 0.8468 - classification_loss: 0.1089 422/500 [========================>.....] - ETA: 26s - loss: 0.9563 - regression_loss: 0.8471 - classification_loss: 0.1091 423/500 [========================>.....] - ETA: 26s - loss: 0.9547 - regression_loss: 0.8458 - classification_loss: 0.1089 424/500 [========================>.....] - ETA: 25s - loss: 0.9540 - regression_loss: 0.8452 - classification_loss: 0.1088 425/500 [========================>.....] 
- ETA: 25s - loss: 0.9533 - regression_loss: 0.8446 - classification_loss: 0.1087 426/500 [========================>.....] - ETA: 25s - loss: 0.9532 - regression_loss: 0.8446 - classification_loss: 0.1086 427/500 [========================>.....] - ETA: 24s - loss: 0.9543 - regression_loss: 0.8453 - classification_loss: 0.1090 428/500 [========================>.....] - ETA: 24s - loss: 0.9557 - regression_loss: 0.8463 - classification_loss: 0.1094 429/500 [========================>.....] - ETA: 24s - loss: 0.9577 - regression_loss: 0.8478 - classification_loss: 0.1099 430/500 [========================>.....] - ETA: 23s - loss: 0.9567 - regression_loss: 0.8470 - classification_loss: 0.1097 431/500 [========================>.....] - ETA: 23s - loss: 0.9553 - regression_loss: 0.8457 - classification_loss: 0.1096 432/500 [========================>.....] - ETA: 23s - loss: 0.9549 - regression_loss: 0.8453 - classification_loss: 0.1095 433/500 [========================>.....] - ETA: 22s - loss: 0.9555 - regression_loss: 0.8459 - classification_loss: 0.1095 434/500 [=========================>....] - ETA: 22s - loss: 0.9566 - regression_loss: 0.8470 - classification_loss: 0.1096 435/500 [=========================>....] - ETA: 22s - loss: 0.9566 - regression_loss: 0.8470 - classification_loss: 0.1096 436/500 [=========================>....] - ETA: 21s - loss: 0.9582 - regression_loss: 0.8483 - classification_loss: 0.1099 437/500 [=========================>....] - ETA: 21s - loss: 0.9589 - regression_loss: 0.8490 - classification_loss: 0.1099 438/500 [=========================>....] - ETA: 21s - loss: 0.9587 - regression_loss: 0.8487 - classification_loss: 0.1099 439/500 [=========================>....] - ETA: 20s - loss: 0.9583 - regression_loss: 0.8485 - classification_loss: 0.1098 440/500 [=========================>....] - ETA: 20s - loss: 0.9571 - regression_loss: 0.8473 - classification_loss: 0.1097 441/500 [=========================>....] 
- ETA: 20s - loss: 0.9576 - regression_loss: 0.8478 - classification_loss: 0.1098 442/500 [=========================>....] - ETA: 19s - loss: 0.9572 - regression_loss: 0.8475 - classification_loss: 0.1097 443/500 [=========================>....] - ETA: 19s - loss: 0.9567 - regression_loss: 0.8472 - classification_loss: 0.1095 444/500 [=========================>....] - ETA: 19s - loss: 0.9555 - regression_loss: 0.8461 - classification_loss: 0.1094 445/500 [=========================>....] - ETA: 18s - loss: 0.9540 - regression_loss: 0.8448 - classification_loss: 0.1092 446/500 [=========================>....] - ETA: 18s - loss: 0.9533 - regression_loss: 0.8442 - classification_loss: 0.1090 447/500 [=========================>....] - ETA: 18s - loss: 0.9526 - regression_loss: 0.8437 - classification_loss: 0.1089 448/500 [=========================>....] - ETA: 17s - loss: 0.9529 - regression_loss: 0.8439 - classification_loss: 0.1089 449/500 [=========================>....] - ETA: 17s - loss: 0.9531 - regression_loss: 0.8440 - classification_loss: 0.1090 450/500 [==========================>...] - ETA: 16s - loss: 0.9544 - regression_loss: 0.8451 - classification_loss: 0.1093 451/500 [==========================>...] - ETA: 16s - loss: 0.9543 - regression_loss: 0.8451 - classification_loss: 0.1092 452/500 [==========================>...] - ETA: 16s - loss: 0.9527 - regression_loss: 0.8437 - classification_loss: 0.1090 453/500 [==========================>...] - ETA: 15s - loss: 0.9531 - regression_loss: 0.8441 - classification_loss: 0.1090 454/500 [==========================>...] - ETA: 15s - loss: 0.9523 - regression_loss: 0.8434 - classification_loss: 0.1089 455/500 [==========================>...] - ETA: 15s - loss: 0.9527 - regression_loss: 0.8439 - classification_loss: 0.1089 456/500 [==========================>...] - ETA: 14s - loss: 0.9521 - regression_loss: 0.8433 - classification_loss: 0.1087 457/500 [==========================>...] 
- ETA: 14s - loss: 0.9518 - regression_loss: 0.8431 - classification_loss: 0.1087 458/500 [==========================>...] - ETA: 14s - loss: 0.9512 - regression_loss: 0.8426 - classification_loss: 0.1085 459/500 [==========================>...] - ETA: 13s - loss: 0.9507 - regression_loss: 0.8422 - classification_loss: 0.1084 460/500 [==========================>...] - ETA: 13s - loss: 0.9524 - regression_loss: 0.8435 - classification_loss: 0.1089 461/500 [==========================>...] - ETA: 13s - loss: 0.9511 - regression_loss: 0.8424 - classification_loss: 0.1087 462/500 [==========================>...] - ETA: 12s - loss: 0.9499 - regression_loss: 0.8412 - classification_loss: 0.1087 463/500 [==========================>...] - ETA: 12s - loss: 0.9503 - regression_loss: 0.8416 - classification_loss: 0.1086 464/500 [==========================>...] - ETA: 12s - loss: 0.9502 - regression_loss: 0.8417 - classification_loss: 0.1086 465/500 [==========================>...] - ETA: 11s - loss: 0.9500 - regression_loss: 0.8415 - classification_loss: 0.1085 466/500 [==========================>...] - ETA: 11s - loss: 0.9496 - regression_loss: 0.8412 - classification_loss: 0.1084 467/500 [===========================>..] - ETA: 11s - loss: 0.9492 - regression_loss: 0.8409 - classification_loss: 0.1083 468/500 [===========================>..] - ETA: 10s - loss: 0.9492 - regression_loss: 0.8408 - classification_loss: 0.1084 469/500 [===========================>..] - ETA: 10s - loss: 0.9494 - regression_loss: 0.8411 - classification_loss: 0.1083 470/500 [===========================>..] - ETA: 10s - loss: 0.9496 - regression_loss: 0.8413 - classification_loss: 0.1083 471/500 [===========================>..] - ETA: 9s - loss: 0.9489 - regression_loss: 0.8408 - classification_loss: 0.1081  472/500 [===========================>..] - ETA: 9s - loss: 0.9490 - regression_loss: 0.8409 - classification_loss: 0.1082 473/500 [===========================>..] 
- ETA: 9s - loss: 0.9492 - regression_loss: 0.8411 - classification_loss: 0.1082 474/500 [===========================>..] - ETA: 8s - loss: 0.9487 - regression_loss: 0.8407 - classification_loss: 0.1080 475/500 [===========================>..] - ETA: 8s - loss: 0.9499 - regression_loss: 0.8417 - classification_loss: 0.1082 476/500 [===========================>..] - ETA: 8s - loss: 0.9511 - regression_loss: 0.8428 - classification_loss: 0.1083 477/500 [===========================>..] - ETA: 7s - loss: 0.9510 - regression_loss: 0.8428 - classification_loss: 0.1083 478/500 [===========================>..] - ETA: 7s - loss: 0.9511 - regression_loss: 0.8429 - classification_loss: 0.1082 479/500 [===========================>..] - ETA: 7s - loss: 0.9507 - regression_loss: 0.8426 - classification_loss: 0.1081 480/500 [===========================>..] - ETA: 6s - loss: 0.9506 - regression_loss: 0.8424 - classification_loss: 0.1082 481/500 [===========================>..] - ETA: 6s - loss: 0.9507 - regression_loss: 0.8426 - classification_loss: 0.1081 482/500 [===========================>..] - ETA: 6s - loss: 0.9500 - regression_loss: 0.8420 - classification_loss: 0.1080 483/500 [===========================>..] - ETA: 5s - loss: 0.9495 - regression_loss: 0.8417 - classification_loss: 0.1079 484/500 [============================>.] - ETA: 5s - loss: 0.9485 - regression_loss: 0.8408 - classification_loss: 0.1077 485/500 [============================>.] - ETA: 5s - loss: 0.9486 - regression_loss: 0.8410 - classification_loss: 0.1077 486/500 [============================>.] - ETA: 4s - loss: 0.9501 - regression_loss: 0.8421 - classification_loss: 0.1080 487/500 [============================>.] - ETA: 4s - loss: 0.9495 - regression_loss: 0.8416 - classification_loss: 0.1079 488/500 [============================>.] - ETA: 4s - loss: 0.9506 - regression_loss: 0.8426 - classification_loss: 0.1080 489/500 [============================>.] 
[Epoch 27/150: per-step progress-bar updates for steps 490-499/500 omitted.]
500/500 [==============================] - 170s 340ms/step - loss: 0.9481 - regression_loss: 0.8407 - classification_loss: 0.1075
326 instances of class plum with average precision: 0.8338
mAP: 0.8338
Epoch 00027: saving model to ./training/snapshots/resnet101_pascal_27.h5
Epoch 28/150
[Epoch 28/150: per-step progress-bar updates for steps 1-68/500 omitted; loss fluctuated between ~0.68 and ~0.96 before settling around ~0.95.]
- ETA: 2:26 - loss: 0.9506 - regression_loss: 0.8477 - classification_loss: 0.1029 69/500 [===>..........................] - ETA: 2:26 - loss: 0.9474 - regression_loss: 0.8448 - classification_loss: 0.1026 70/500 [===>..........................] - ETA: 2:26 - loss: 0.9572 - regression_loss: 0.8530 - classification_loss: 0.1042 71/500 [===>..........................] - ETA: 2:25 - loss: 0.9565 - regression_loss: 0.8528 - classification_loss: 0.1036 72/500 [===>..........................] - ETA: 2:25 - loss: 0.9534 - regression_loss: 0.8505 - classification_loss: 0.1029 73/500 [===>..........................] - ETA: 2:24 - loss: 0.9479 - regression_loss: 0.8459 - classification_loss: 0.1021 74/500 [===>..........................] - ETA: 2:24 - loss: 0.9519 - regression_loss: 0.8484 - classification_loss: 0.1035 75/500 [===>..........................] - ETA: 2:24 - loss: 0.9536 - regression_loss: 0.8507 - classification_loss: 0.1029 76/500 [===>..........................] - ETA: 2:23 - loss: 0.9503 - regression_loss: 0.8483 - classification_loss: 0.1019 77/500 [===>..........................] - ETA: 2:23 - loss: 0.9526 - regression_loss: 0.8499 - classification_loss: 0.1027 78/500 [===>..........................] - ETA: 2:23 - loss: 0.9563 - regression_loss: 0.8530 - classification_loss: 0.1034 79/500 [===>..........................] - ETA: 2:23 - loss: 0.9567 - regression_loss: 0.8538 - classification_loss: 0.1029 80/500 [===>..........................] - ETA: 2:22 - loss: 0.9530 - regression_loss: 0.8504 - classification_loss: 0.1026 81/500 [===>..........................] - ETA: 2:22 - loss: 0.9518 - regression_loss: 0.8498 - classification_loss: 0.1020 82/500 [===>..........................] - ETA: 2:22 - loss: 0.9491 - regression_loss: 0.8478 - classification_loss: 0.1013 83/500 [===>..........................] - ETA: 2:21 - loss: 0.9510 - regression_loss: 0.8499 - classification_loss: 0.1011 84/500 [====>.........................] 
- ETA: 2:21 - loss: 0.9554 - regression_loss: 0.8527 - classification_loss: 0.1027 85/500 [====>.........................] - ETA: 2:21 - loss: 0.9546 - regression_loss: 0.8520 - classification_loss: 0.1026 86/500 [====>.........................] - ETA: 2:20 - loss: 0.9531 - regression_loss: 0.8502 - classification_loss: 0.1029 87/500 [====>.........................] - ETA: 2:20 - loss: 0.9545 - regression_loss: 0.8514 - classification_loss: 0.1031 88/500 [====>.........................] - ETA: 2:20 - loss: 0.9607 - regression_loss: 0.8573 - classification_loss: 0.1035 89/500 [====>.........................] - ETA: 2:19 - loss: 0.9637 - regression_loss: 0.8599 - classification_loss: 0.1038 90/500 [====>.........................] - ETA: 2:19 - loss: 0.9561 - regression_loss: 0.8533 - classification_loss: 0.1029 91/500 [====>.........................] - ETA: 2:19 - loss: 0.9564 - regression_loss: 0.8535 - classification_loss: 0.1029 92/500 [====>.........................] - ETA: 2:18 - loss: 0.9643 - regression_loss: 0.8611 - classification_loss: 0.1032 93/500 [====>.........................] - ETA: 2:18 - loss: 0.9633 - regression_loss: 0.8604 - classification_loss: 0.1029 94/500 [====>.........................] - ETA: 2:18 - loss: 0.9623 - regression_loss: 0.8599 - classification_loss: 0.1024 95/500 [====>.........................] - ETA: 2:18 - loss: 0.9598 - regression_loss: 0.8578 - classification_loss: 0.1020 96/500 [====>.........................] - ETA: 2:17 - loss: 0.9632 - regression_loss: 0.8610 - classification_loss: 0.1021 97/500 [====>.........................] - ETA: 2:17 - loss: 0.9613 - regression_loss: 0.8591 - classification_loss: 0.1022 98/500 [====>.........................] - ETA: 2:16 - loss: 0.9553 - regression_loss: 0.8537 - classification_loss: 0.1016 99/500 [====>.........................] - ETA: 2:16 - loss: 0.9555 - regression_loss: 0.8536 - classification_loss: 0.1018 100/500 [=====>........................] 
- ETA: 2:16 - loss: 0.9523 - regression_loss: 0.8511 - classification_loss: 0.1012 101/500 [=====>........................] - ETA: 2:15 - loss: 0.9537 - regression_loss: 0.8520 - classification_loss: 0.1017 102/500 [=====>........................] - ETA: 2:15 - loss: 0.9562 - regression_loss: 0.8538 - classification_loss: 0.1024 103/500 [=====>........................] - ETA: 2:15 - loss: 0.9523 - regression_loss: 0.8506 - classification_loss: 0.1017 104/500 [=====>........................] - ETA: 2:14 - loss: 0.9490 - regression_loss: 0.8475 - classification_loss: 0.1015 105/500 [=====>........................] - ETA: 2:14 - loss: 0.9470 - regression_loss: 0.8457 - classification_loss: 0.1013 106/500 [=====>........................] - ETA: 2:14 - loss: 0.9434 - regression_loss: 0.8428 - classification_loss: 0.1006 107/500 [=====>........................] - ETA: 2:13 - loss: 0.9389 - regression_loss: 0.8390 - classification_loss: 0.0999 108/500 [=====>........................] - ETA: 2:13 - loss: 0.9340 - regression_loss: 0.8348 - classification_loss: 0.0992 109/500 [=====>........................] - ETA: 2:13 - loss: 0.9370 - regression_loss: 0.8381 - classification_loss: 0.0989 110/500 [=====>........................] - ETA: 2:12 - loss: 0.9381 - regression_loss: 0.8397 - classification_loss: 0.0984 111/500 [=====>........................] - ETA: 2:12 - loss: 0.9436 - regression_loss: 0.8450 - classification_loss: 0.0985 112/500 [=====>........................] - ETA: 2:12 - loss: 0.9440 - regression_loss: 0.8456 - classification_loss: 0.0984 113/500 [=====>........................] - ETA: 2:11 - loss: 0.9411 - regression_loss: 0.8432 - classification_loss: 0.0979 114/500 [=====>........................] - ETA: 2:11 - loss: 0.9433 - regression_loss: 0.8447 - classification_loss: 0.0986 115/500 [=====>........................] - ETA: 2:11 - loss: 0.9395 - regression_loss: 0.8416 - classification_loss: 0.0979 116/500 [=====>........................] 
- ETA: 2:10 - loss: 0.9395 - regression_loss: 0.8417 - classification_loss: 0.0978 117/500 [======>.......................] - ETA: 2:10 - loss: 0.9421 - regression_loss: 0.8441 - classification_loss: 0.0980 118/500 [======>.......................] - ETA: 2:10 - loss: 0.9493 - regression_loss: 0.8488 - classification_loss: 0.1004 119/500 [======>.......................] - ETA: 2:09 - loss: 0.9511 - regression_loss: 0.8497 - classification_loss: 0.1014 120/500 [======>.......................] - ETA: 2:09 - loss: 0.9530 - regression_loss: 0.8515 - classification_loss: 0.1016 121/500 [======>.......................] - ETA: 2:09 - loss: 0.9500 - regression_loss: 0.8487 - classification_loss: 0.1013 122/500 [======>.......................] - ETA: 2:08 - loss: 0.9492 - regression_loss: 0.8481 - classification_loss: 0.1011 123/500 [======>.......................] - ETA: 2:08 - loss: 0.9487 - regression_loss: 0.8481 - classification_loss: 0.1006 124/500 [======>.......................] - ETA: 2:08 - loss: 0.9453 - regression_loss: 0.8451 - classification_loss: 0.1002 125/500 [======>.......................] - ETA: 2:07 - loss: 0.9379 - regression_loss: 0.8383 - classification_loss: 0.0996 126/500 [======>.......................] - ETA: 2:07 - loss: 0.9382 - regression_loss: 0.8378 - classification_loss: 0.1003 127/500 [======>.......................] - ETA: 2:07 - loss: 0.9377 - regression_loss: 0.8375 - classification_loss: 0.1002 128/500 [======>.......................] - ETA: 2:06 - loss: 0.9439 - regression_loss: 0.8429 - classification_loss: 0.1010 129/500 [======>.......................] - ETA: 2:06 - loss: 0.9448 - regression_loss: 0.8437 - classification_loss: 0.1011 130/500 [======>.......................] - ETA: 2:06 - loss: 0.9462 - regression_loss: 0.8451 - classification_loss: 0.1011 131/500 [======>.......................] - ETA: 2:05 - loss: 0.9445 - regression_loss: 0.8436 - classification_loss: 0.1009 132/500 [======>.......................] 
- ETA: 2:05 - loss: 0.9398 - regression_loss: 0.8395 - classification_loss: 0.1002 133/500 [======>.......................] - ETA: 2:05 - loss: 0.9407 - regression_loss: 0.8402 - classification_loss: 0.1005 134/500 [=======>......................] - ETA: 2:04 - loss: 0.9377 - regression_loss: 0.8376 - classification_loss: 0.1001 135/500 [=======>......................] - ETA: 2:04 - loss: 0.9400 - regression_loss: 0.8389 - classification_loss: 0.1011 136/500 [=======>......................] - ETA: 2:04 - loss: 0.9411 - regression_loss: 0.8394 - classification_loss: 0.1017 137/500 [=======>......................] - ETA: 2:03 - loss: 0.9386 - regression_loss: 0.8373 - classification_loss: 0.1013 138/500 [=======>......................] - ETA: 2:03 - loss: 0.9401 - regression_loss: 0.8389 - classification_loss: 0.1012 139/500 [=======>......................] - ETA: 2:03 - loss: 0.9346 - regression_loss: 0.8340 - classification_loss: 0.1006 140/500 [=======>......................] - ETA: 2:02 - loss: 0.9343 - regression_loss: 0.8339 - classification_loss: 0.1004 141/500 [=======>......................] - ETA: 2:02 - loss: 0.9333 - regression_loss: 0.8331 - classification_loss: 0.1002 142/500 [=======>......................] - ETA: 2:02 - loss: 0.9320 - regression_loss: 0.8318 - classification_loss: 0.1002 143/500 [=======>......................] - ETA: 2:01 - loss: 0.9330 - regression_loss: 0.8328 - classification_loss: 0.1002 144/500 [=======>......................] - ETA: 2:01 - loss: 0.9307 - regression_loss: 0.8308 - classification_loss: 0.0998 145/500 [=======>......................] - ETA: 2:01 - loss: 0.9274 - regression_loss: 0.8281 - classification_loss: 0.0994 146/500 [=======>......................] - ETA: 2:00 - loss: 0.9273 - regression_loss: 0.8280 - classification_loss: 0.0994 147/500 [=======>......................] - ETA: 2:00 - loss: 0.9266 - regression_loss: 0.8273 - classification_loss: 0.0993 148/500 [=======>......................] 
- ETA: 2:00 - loss: 0.9299 - regression_loss: 0.8302 - classification_loss: 0.0996 149/500 [=======>......................] - ETA: 1:59 - loss: 0.9328 - regression_loss: 0.8325 - classification_loss: 0.1003 150/500 [========>.....................] - ETA: 1:59 - loss: 0.9333 - regression_loss: 0.8326 - classification_loss: 0.1007 151/500 [========>.....................] - ETA: 1:59 - loss: 0.9308 - regression_loss: 0.8305 - classification_loss: 0.1003 152/500 [========>.....................] - ETA: 1:58 - loss: 0.9344 - regression_loss: 0.8331 - classification_loss: 0.1014 153/500 [========>.....................] - ETA: 1:58 - loss: 0.9350 - regression_loss: 0.8336 - classification_loss: 0.1014 154/500 [========>.....................] - ETA: 1:58 - loss: 0.9332 - regression_loss: 0.8320 - classification_loss: 0.1012 155/500 [========>.....................] - ETA: 1:57 - loss: 0.9378 - regression_loss: 0.8358 - classification_loss: 0.1020 156/500 [========>.....................] - ETA: 1:57 - loss: 0.9375 - regression_loss: 0.8354 - classification_loss: 0.1021 157/500 [========>.....................] - ETA: 1:57 - loss: 0.9354 - regression_loss: 0.8334 - classification_loss: 0.1020 158/500 [========>.....................] - ETA: 1:56 - loss: 0.9393 - regression_loss: 0.8363 - classification_loss: 0.1030 159/500 [========>.....................] - ETA: 1:56 - loss: 0.9399 - regression_loss: 0.8370 - classification_loss: 0.1028 160/500 [========>.....................] - ETA: 1:56 - loss: 0.9386 - regression_loss: 0.8354 - classification_loss: 0.1032 161/500 [========>.....................] - ETA: 1:55 - loss: 0.9366 - regression_loss: 0.8337 - classification_loss: 0.1029 162/500 [========>.....................] - ETA: 1:55 - loss: 0.9358 - regression_loss: 0.8331 - classification_loss: 0.1026 163/500 [========>.....................] - ETA: 1:55 - loss: 0.9350 - regression_loss: 0.8323 - classification_loss: 0.1028 164/500 [========>.....................] 
- ETA: 1:54 - loss: 0.9435 - regression_loss: 0.8395 - classification_loss: 0.1041 165/500 [========>.....................] - ETA: 1:54 - loss: 0.9425 - regression_loss: 0.8388 - classification_loss: 0.1037 166/500 [========>.....................] - ETA: 1:54 - loss: 0.9426 - regression_loss: 0.8387 - classification_loss: 0.1039 167/500 [=========>....................] - ETA: 1:53 - loss: 0.9451 - regression_loss: 0.8408 - classification_loss: 0.1044 168/500 [=========>....................] - ETA: 1:53 - loss: 0.9448 - regression_loss: 0.8403 - classification_loss: 0.1045 169/500 [=========>....................] - ETA: 1:53 - loss: 0.9442 - regression_loss: 0.8399 - classification_loss: 0.1043 170/500 [=========>....................] - ETA: 1:52 - loss: 0.9447 - regression_loss: 0.8403 - classification_loss: 0.1043 171/500 [=========>....................] - ETA: 1:52 - loss: 0.9429 - regression_loss: 0.8384 - classification_loss: 0.1045 172/500 [=========>....................] - ETA: 1:52 - loss: 0.9437 - regression_loss: 0.8390 - classification_loss: 0.1047 173/500 [=========>....................] - ETA: 1:51 - loss: 0.9466 - regression_loss: 0.8417 - classification_loss: 0.1049 174/500 [=========>....................] - ETA: 1:51 - loss: 0.9430 - regression_loss: 0.8386 - classification_loss: 0.1044 175/500 [=========>....................] - ETA: 1:51 - loss: 0.9452 - regression_loss: 0.8408 - classification_loss: 0.1045 176/500 [=========>....................] - ETA: 1:50 - loss: 0.9448 - regression_loss: 0.8405 - classification_loss: 0.1042 177/500 [=========>....................] - ETA: 1:50 - loss: 0.9457 - regression_loss: 0.8417 - classification_loss: 0.1041 178/500 [=========>....................] - ETA: 1:50 - loss: 0.9478 - regression_loss: 0.8434 - classification_loss: 0.1044 179/500 [=========>....................] - ETA: 1:49 - loss: 0.9468 - regression_loss: 0.8428 - classification_loss: 0.1040 180/500 [=========>....................] 
- ETA: 1:49 - loss: 0.9447 - regression_loss: 0.8411 - classification_loss: 0.1036 181/500 [=========>....................] - ETA: 1:49 - loss: 0.9443 - regression_loss: 0.8410 - classification_loss: 0.1034 182/500 [=========>....................] - ETA: 1:48 - loss: 0.9424 - regression_loss: 0.8393 - classification_loss: 0.1031 183/500 [=========>....................] - ETA: 1:48 - loss: 0.9418 - regression_loss: 0.8385 - classification_loss: 0.1033 184/500 [==========>...................] - ETA: 1:47 - loss: 0.9391 - regression_loss: 0.8362 - classification_loss: 0.1029 185/500 [==========>...................] - ETA: 1:47 - loss: 0.9372 - regression_loss: 0.8346 - classification_loss: 0.1026 186/500 [==========>...................] - ETA: 1:47 - loss: 0.9349 - regression_loss: 0.8327 - classification_loss: 0.1022 187/500 [==========>...................] - ETA: 1:46 - loss: 0.9371 - regression_loss: 0.8342 - classification_loss: 0.1029 188/500 [==========>...................] - ETA: 1:46 - loss: 0.9354 - regression_loss: 0.8328 - classification_loss: 0.1026 189/500 [==========>...................] - ETA: 1:46 - loss: 0.9317 - regression_loss: 0.8296 - classification_loss: 0.1022 190/500 [==========>...................] - ETA: 1:45 - loss: 0.9308 - regression_loss: 0.8286 - classification_loss: 0.1022 191/500 [==========>...................] - ETA: 1:45 - loss: 0.9298 - regression_loss: 0.8277 - classification_loss: 0.1021 192/500 [==========>...................] - ETA: 1:45 - loss: 0.9302 - regression_loss: 0.8281 - classification_loss: 0.1021 193/500 [==========>...................] - ETA: 1:44 - loss: 0.9296 - regression_loss: 0.8275 - classification_loss: 0.1020 194/500 [==========>...................] - ETA: 1:44 - loss: 0.9305 - regression_loss: 0.8285 - classification_loss: 0.1020 195/500 [==========>...................] - ETA: 1:44 - loss: 0.9326 - regression_loss: 0.8300 - classification_loss: 0.1026 196/500 [==========>...................] 
- ETA: 1:43 - loss: 0.9306 - regression_loss: 0.8284 - classification_loss: 0.1022 197/500 [==========>...................] - ETA: 1:43 - loss: 0.9284 - regression_loss: 0.8266 - classification_loss: 0.1019 198/500 [==========>...................] - ETA: 1:43 - loss: 0.9279 - regression_loss: 0.8261 - classification_loss: 0.1018 199/500 [==========>...................] - ETA: 1:42 - loss: 0.9283 - regression_loss: 0.8264 - classification_loss: 0.1019 200/500 [===========>..................] - ETA: 1:42 - loss: 0.9303 - regression_loss: 0.8286 - classification_loss: 0.1017 201/500 [===========>..................] - ETA: 1:42 - loss: 0.9320 - regression_loss: 0.8303 - classification_loss: 0.1017 202/500 [===========>..................] - ETA: 1:41 - loss: 0.9318 - regression_loss: 0.8300 - classification_loss: 0.1018 203/500 [===========>..................] - ETA: 1:41 - loss: 0.9297 - regression_loss: 0.8282 - classification_loss: 0.1015 204/500 [===========>..................] - ETA: 1:40 - loss: 0.9330 - regression_loss: 0.8304 - classification_loss: 0.1026 205/500 [===========>..................] - ETA: 1:40 - loss: 0.9337 - regression_loss: 0.8312 - classification_loss: 0.1025 206/500 [===========>..................] - ETA: 1:40 - loss: 0.9349 - regression_loss: 0.8323 - classification_loss: 0.1026 207/500 [===========>..................] - ETA: 1:39 - loss: 0.9337 - regression_loss: 0.8311 - classification_loss: 0.1026 208/500 [===========>..................] - ETA: 1:39 - loss: 0.9304 - regression_loss: 0.8282 - classification_loss: 0.1022 209/500 [===========>..................] - ETA: 1:39 - loss: 0.9310 - regression_loss: 0.8289 - classification_loss: 0.1021 210/500 [===========>..................] - ETA: 1:38 - loss: 0.9337 - regression_loss: 0.8310 - classification_loss: 0.1027 211/500 [===========>..................] - ETA: 1:38 - loss: 0.9341 - regression_loss: 0.8314 - classification_loss: 0.1027 212/500 [===========>..................] 
- ETA: 1:38 - loss: 0.9357 - regression_loss: 0.8330 - classification_loss: 0.1027 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9355 - regression_loss: 0.8328 - classification_loss: 0.1027 214/500 [===========>..................] - ETA: 1:37 - loss: 0.9358 - regression_loss: 0.8329 - classification_loss: 0.1029 215/500 [===========>..................] - ETA: 1:37 - loss: 0.9336 - regression_loss: 0.8311 - classification_loss: 0.1026 216/500 [===========>..................] - ETA: 1:36 - loss: 0.9302 - regression_loss: 0.8280 - classification_loss: 0.1022 217/500 [============>.................] - ETA: 1:36 - loss: 0.9270 - regression_loss: 0.8252 - classification_loss: 0.1018 218/500 [============>.................] - ETA: 1:36 - loss: 0.9281 - regression_loss: 0.8262 - classification_loss: 0.1019 219/500 [============>.................] - ETA: 1:35 - loss: 0.9298 - regression_loss: 0.8276 - classification_loss: 0.1022 220/500 [============>.................] - ETA: 1:35 - loss: 0.9297 - regression_loss: 0.8277 - classification_loss: 0.1020 221/500 [============>.................] - ETA: 1:35 - loss: 0.9311 - regression_loss: 0.8290 - classification_loss: 0.1021 222/500 [============>.................] - ETA: 1:34 - loss: 0.9324 - regression_loss: 0.8305 - classification_loss: 0.1019 223/500 [============>.................] - ETA: 1:34 - loss: 0.9331 - regression_loss: 0.8311 - classification_loss: 0.1019 224/500 [============>.................] - ETA: 1:34 - loss: 0.9335 - regression_loss: 0.8317 - classification_loss: 0.1018 225/500 [============>.................] - ETA: 1:33 - loss: 0.9351 - regression_loss: 0.8333 - classification_loss: 0.1019 226/500 [============>.................] - ETA: 1:33 - loss: 0.9356 - regression_loss: 0.8337 - classification_loss: 0.1018 227/500 [============>.................] - ETA: 1:33 - loss: 0.9386 - regression_loss: 0.8362 - classification_loss: 0.1024 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.9388 - regression_loss: 0.8366 - classification_loss: 0.1022 229/500 [============>.................] - ETA: 1:32 - loss: 0.9392 - regression_loss: 0.8371 - classification_loss: 0.1021 230/500 [============>.................] - ETA: 1:32 - loss: 0.9396 - regression_loss: 0.8376 - classification_loss: 0.1020 231/500 [============>.................] - ETA: 1:31 - loss: 0.9398 - regression_loss: 0.8374 - classification_loss: 0.1024 232/500 [============>.................] - ETA: 1:31 - loss: 0.9440 - regression_loss: 0.8406 - classification_loss: 0.1034 233/500 [============>.................] - ETA: 1:31 - loss: 0.9447 - regression_loss: 0.8413 - classification_loss: 0.1034 234/500 [=============>................] - ETA: 1:30 - loss: 0.9457 - regression_loss: 0.8422 - classification_loss: 0.1035 235/500 [=============>................] - ETA: 1:30 - loss: 0.9453 - regression_loss: 0.8420 - classification_loss: 0.1033 236/500 [=============>................] - ETA: 1:30 - loss: 0.9434 - regression_loss: 0.8404 - classification_loss: 0.1030 237/500 [=============>................] - ETA: 1:29 - loss: 0.9435 - regression_loss: 0.8405 - classification_loss: 0.1030 238/500 [=============>................] - ETA: 1:29 - loss: 0.9425 - regression_loss: 0.8396 - classification_loss: 0.1028 239/500 [=============>................] - ETA: 1:29 - loss: 0.9426 - regression_loss: 0.8398 - classification_loss: 0.1028 240/500 [=============>................] - ETA: 1:28 - loss: 0.9450 - regression_loss: 0.8418 - classification_loss: 0.1032 241/500 [=============>................] - ETA: 1:28 - loss: 0.9452 - regression_loss: 0.8420 - classification_loss: 0.1032 242/500 [=============>................] - ETA: 1:28 - loss: 0.9435 - regression_loss: 0.8405 - classification_loss: 0.1030 243/500 [=============>................] - ETA: 1:27 - loss: 0.9414 - regression_loss: 0.8387 - classification_loss: 0.1027 244/500 [=============>................] 
- ETA: 1:27 - loss: 0.9418 - regression_loss: 0.8390 - classification_loss: 0.1028 245/500 [=============>................] - ETA: 1:26 - loss: 0.9413 - regression_loss: 0.8387 - classification_loss: 0.1026 246/500 [=============>................] - ETA: 1:26 - loss: 0.9429 - regression_loss: 0.8396 - classification_loss: 0.1033 247/500 [=============>................] - ETA: 1:26 - loss: 0.9415 - regression_loss: 0.8384 - classification_loss: 0.1031 248/500 [=============>................] - ETA: 1:25 - loss: 0.9438 - regression_loss: 0.8405 - classification_loss: 0.1033 249/500 [=============>................] - ETA: 1:25 - loss: 0.9430 - regression_loss: 0.8399 - classification_loss: 0.1031 250/500 [==============>...............] - ETA: 1:25 - loss: 0.9417 - regression_loss: 0.8389 - classification_loss: 0.1028 251/500 [==============>...............] - ETA: 1:24 - loss: 0.9409 - regression_loss: 0.8384 - classification_loss: 0.1026 252/500 [==============>...............] - ETA: 1:24 - loss: 0.9414 - regression_loss: 0.8389 - classification_loss: 0.1025 253/500 [==============>...............] - ETA: 1:24 - loss: 0.9418 - regression_loss: 0.8392 - classification_loss: 0.1026 254/500 [==============>...............] - ETA: 1:23 - loss: 0.9424 - regression_loss: 0.8400 - classification_loss: 0.1025 255/500 [==============>...............] - ETA: 1:23 - loss: 0.9422 - regression_loss: 0.8400 - classification_loss: 0.1022 256/500 [==============>...............] - ETA: 1:23 - loss: 0.9419 - regression_loss: 0.8396 - classification_loss: 0.1022 257/500 [==============>...............] - ETA: 1:22 - loss: 0.9417 - regression_loss: 0.8397 - classification_loss: 0.1020 258/500 [==============>...............] - ETA: 1:22 - loss: 0.9420 - regression_loss: 0.8399 - classification_loss: 0.1022 259/500 [==============>...............] - ETA: 1:22 - loss: 0.9424 - regression_loss: 0.8403 - classification_loss: 0.1021 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.9411 - regression_loss: 0.8393 - classification_loss: 0.1018 261/500 [==============>...............] - ETA: 1:21 - loss: 0.9430 - regression_loss: 0.8406 - classification_loss: 0.1024 262/500 [==============>...............] - ETA: 1:21 - loss: 0.9442 - regression_loss: 0.8419 - classification_loss: 0.1022 263/500 [==============>...............] - ETA: 1:20 - loss: 0.9436 - regression_loss: 0.8416 - classification_loss: 0.1020 264/500 [==============>...............] - ETA: 1:20 - loss: 0.9424 - regression_loss: 0.8402 - classification_loss: 0.1021 265/500 [==============>...............] - ETA: 1:20 - loss: 0.9440 - regression_loss: 0.8415 - classification_loss: 0.1026 266/500 [==============>...............] - ETA: 1:19 - loss: 0.9414 - regression_loss: 0.8392 - classification_loss: 0.1022 267/500 [===============>..............] - ETA: 1:19 - loss: 0.9440 - regression_loss: 0.8414 - classification_loss: 0.1026 268/500 [===============>..............] - ETA: 1:19 - loss: 0.9438 - regression_loss: 0.8414 - classification_loss: 0.1024 269/500 [===============>..............] - ETA: 1:18 - loss: 0.9450 - regression_loss: 0.8425 - classification_loss: 0.1025 270/500 [===============>..............] - ETA: 1:18 - loss: 0.9421 - regression_loss: 0.8400 - classification_loss: 0.1021 271/500 [===============>..............] - ETA: 1:17 - loss: 0.9425 - regression_loss: 0.8404 - classification_loss: 0.1021 272/500 [===============>..............] - ETA: 1:17 - loss: 0.9440 - regression_loss: 0.8418 - classification_loss: 0.1023 273/500 [===============>..............] - ETA: 1:17 - loss: 0.9450 - regression_loss: 0.8426 - classification_loss: 0.1024 274/500 [===============>..............] - ETA: 1:16 - loss: 0.9459 - regression_loss: 0.8435 - classification_loss: 0.1024 275/500 [===============>..............] - ETA: 1:16 - loss: 0.9463 - regression_loss: 0.8441 - classification_loss: 0.1022 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.9460 - regression_loss: 0.8437 - classification_loss: 0.1022 277/500 [===============>..............] - ETA: 1:15 - loss: 0.9461 - regression_loss: 0.8438 - classification_loss: 0.1024 278/500 [===============>..............] - ETA: 1:15 - loss: 0.9453 - regression_loss: 0.8431 - classification_loss: 0.1022 279/500 [===============>..............] - ETA: 1:15 - loss: 0.9444 - regression_loss: 0.8424 - classification_loss: 0.1020 280/500 [===============>..............] - ETA: 1:14 - loss: 0.9449 - regression_loss: 0.8427 - classification_loss: 0.1022 281/500 [===============>..............] - ETA: 1:14 - loss: 0.9471 - regression_loss: 0.8445 - classification_loss: 0.1026 282/500 [===============>..............] - ETA: 1:14 - loss: 0.9472 - regression_loss: 0.8447 - classification_loss: 0.1025 283/500 [===============>..............] - ETA: 1:13 - loss: 0.9466 - regression_loss: 0.8441 - classification_loss: 0.1024 284/500 [================>.............] - ETA: 1:13 - loss: 0.9468 - regression_loss: 0.8444 - classification_loss: 0.1025 285/500 [================>.............] - ETA: 1:13 - loss: 0.9455 - regression_loss: 0.8433 - classification_loss: 0.1022 286/500 [================>.............] - ETA: 1:12 - loss: 0.9467 - regression_loss: 0.8444 - classification_loss: 0.1023 287/500 [================>.............] - ETA: 1:12 - loss: 0.9457 - regression_loss: 0.8436 - classification_loss: 0.1021 288/500 [================>.............] - ETA: 1:12 - loss: 0.9460 - regression_loss: 0.8439 - classification_loss: 0.1021 289/500 [================>.............] - ETA: 1:11 - loss: 0.9448 - regression_loss: 0.8430 - classification_loss: 0.1018 290/500 [================>.............] - ETA: 1:11 - loss: 0.9439 - regression_loss: 0.8423 - classification_loss: 0.1017 291/500 [================>.............] - ETA: 1:11 - loss: 0.9445 - regression_loss: 0.8426 - classification_loss: 0.1018 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.9430 - regression_loss: 0.8413 - classification_loss: 0.1017 293/500 [================>.............] - ETA: 1:10 - loss: 0.9415 - regression_loss: 0.8395 - classification_loss: 0.1020 294/500 [================>.............] - ETA: 1:10 - loss: 0.9414 - regression_loss: 0.8393 - classification_loss: 0.1021 295/500 [================>.............] - ETA: 1:09 - loss: 0.9419 - regression_loss: 0.8399 - classification_loss: 0.1020 296/500 [================>.............] - ETA: 1:09 - loss: 0.9442 - regression_loss: 0.8418 - classification_loss: 0.1024 297/500 [================>.............] - ETA: 1:09 - loss: 0.9447 - regression_loss: 0.8423 - classification_loss: 0.1024 298/500 [================>.............] - ETA: 1:08 - loss: 0.9454 - regression_loss: 0.8429 - classification_loss: 0.1025 299/500 [================>.............] - ETA: 1:08 - loss: 0.9487 - regression_loss: 0.8456 - classification_loss: 0.1031 300/500 [=================>............] - ETA: 1:08 - loss: 0.9520 - regression_loss: 0.8484 - classification_loss: 0.1036 301/500 [=================>............] - ETA: 1:07 - loss: 0.9528 - regression_loss: 0.8490 - classification_loss: 0.1038 302/500 [=================>............] - ETA: 1:07 - loss: 0.9525 - regression_loss: 0.8489 - classification_loss: 0.1036 303/500 [=================>............] - ETA: 1:07 - loss: 0.9509 - regression_loss: 0.8476 - classification_loss: 0.1033 304/500 [=================>............] - ETA: 1:06 - loss: 0.9514 - regression_loss: 0.8481 - classification_loss: 0.1033 305/500 [=================>............] - ETA: 1:06 - loss: 0.9502 - regression_loss: 0.8471 - classification_loss: 0.1031 306/500 [=================>............] - ETA: 1:06 - loss: 0.9487 - regression_loss: 0.8458 - classification_loss: 0.1029 307/500 [=================>............] - ETA: 1:05 - loss: 0.9457 - regression_loss: 0.8430 - classification_loss: 0.1026 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.9456 - regression_loss: 0.8430 - classification_loss: 0.1027 309/500 [=================>............] - ETA: 1:05 - loss: 0.9459 - regression_loss: 0.8431 - classification_loss: 0.1028 310/500 [=================>............] - ETA: 1:04 - loss: 0.9454 - regression_loss: 0.8427 - classification_loss: 0.1026 311/500 [=================>............] - ETA: 1:04 - loss: 0.9455 - regression_loss: 0.8430 - classification_loss: 0.1025 312/500 [=================>............] - ETA: 1:04 - loss: 0.9450 - regression_loss: 0.8427 - classification_loss: 0.1023 313/500 [=================>............] - ETA: 1:03 - loss: 0.9460 - regression_loss: 0.8437 - classification_loss: 0.1023 314/500 [=================>............] - ETA: 1:03 - loss: 0.9458 - regression_loss: 0.8433 - classification_loss: 0.1025 315/500 [=================>............] - ETA: 1:02 - loss: 0.9460 - regression_loss: 0.8435 - classification_loss: 0.1025 316/500 [=================>............] - ETA: 1:02 - loss: 0.9462 - regression_loss: 0.8440 - classification_loss: 0.1022 317/500 [==================>...........] - ETA: 1:02 - loss: 0.9462 - regression_loss: 0.8440 - classification_loss: 0.1022 318/500 [==================>...........] - ETA: 1:01 - loss: 0.9451 - regression_loss: 0.8430 - classification_loss: 0.1021 319/500 [==================>...........] - ETA: 1:01 - loss: 0.9458 - regression_loss: 0.8437 - classification_loss: 0.1021 320/500 [==================>...........] - ETA: 1:01 - loss: 0.9452 - regression_loss: 0.8431 - classification_loss: 0.1021 321/500 [==================>...........] - ETA: 1:00 - loss: 0.9465 - regression_loss: 0.8443 - classification_loss: 0.1022 322/500 [==================>...........] - ETA: 1:00 - loss: 0.9457 - regression_loss: 0.8437 - classification_loss: 0.1020 323/500 [==================>...........] - ETA: 1:00 - loss: 0.9465 - regression_loss: 0.8443 - classification_loss: 0.1022 324/500 [==================>...........] 
- ETA: 59s - loss: 0.9445 - regression_loss: 0.8425 - classification_loss: 0.1020  325/500 [==================>...........] - ETA: 59s - loss: 0.9448 - regression_loss: 0.8428 - classification_loss: 0.1020 326/500 [==================>...........] - ETA: 59s - loss: 0.9451 - regression_loss: 0.8431 - classification_loss: 0.1019 327/500 [==================>...........] - ETA: 58s - loss: 0.9474 - regression_loss: 0.8445 - classification_loss: 0.1028 328/500 [==================>...........] - ETA: 58s - loss: 0.9487 - regression_loss: 0.8453 - classification_loss: 0.1033 329/500 [==================>...........] - ETA: 58s - loss: 0.9481 - regression_loss: 0.8449 - classification_loss: 0.1032 330/500 [==================>...........] - ETA: 57s - loss: 0.9492 - regression_loss: 0.8458 - classification_loss: 0.1034 331/500 [==================>...........] - ETA: 57s - loss: 0.9488 - regression_loss: 0.8455 - classification_loss: 0.1033 332/500 [==================>...........] - ETA: 57s - loss: 0.9488 - regression_loss: 0.8457 - classification_loss: 0.1031 333/500 [==================>...........] - ETA: 56s - loss: 0.9483 - regression_loss: 0.8453 - classification_loss: 0.1030 334/500 [===================>..........] - ETA: 56s - loss: 0.9490 - regression_loss: 0.8460 - classification_loss: 0.1030 335/500 [===================>..........] - ETA: 56s - loss: 0.9496 - regression_loss: 0.8469 - classification_loss: 0.1028 336/500 [===================>..........] - ETA: 55s - loss: 0.9512 - regression_loss: 0.8483 - classification_loss: 0.1029 337/500 [===================>..........] - ETA: 55s - loss: 0.9520 - regression_loss: 0.8491 - classification_loss: 0.1029 338/500 [===================>..........] - ETA: 55s - loss: 0.9516 - regression_loss: 0.8488 - classification_loss: 0.1028 339/500 [===================>..........] - ETA: 54s - loss: 0.9541 - regression_loss: 0.8507 - classification_loss: 0.1034 340/500 [===================>..........] 
- ETA: 54s - loss: 0.9547 - regression_loss: 0.8514 - classification_loss: 0.1033 341/500 [===================>..........] - ETA: 54s - loss: 0.9566 - regression_loss: 0.8527 - classification_loss: 0.1039 342/500 [===================>..........] - ETA: 53s - loss: 0.9577 - regression_loss: 0.8536 - classification_loss: 0.1041 343/500 [===================>..........] - ETA: 53s - loss: 0.9586 - regression_loss: 0.8546 - classification_loss: 0.1040 344/500 [===================>..........] - ETA: 53s - loss: 0.9578 - regression_loss: 0.8539 - classification_loss: 0.1039 345/500 [===================>..........] - ETA: 52s - loss: 0.9569 - regression_loss: 0.8531 - classification_loss: 0.1038 346/500 [===================>..........] - ETA: 52s - loss: 0.9571 - regression_loss: 0.8534 - classification_loss: 0.1037 347/500 [===================>..........] - ETA: 52s - loss: 0.9568 - regression_loss: 0.8533 - classification_loss: 0.1036 348/500 [===================>..........] - ETA: 51s - loss: 0.9561 - regression_loss: 0.8527 - classification_loss: 0.1034 349/500 [===================>..........] - ETA: 51s - loss: 0.9559 - regression_loss: 0.8526 - classification_loss: 0.1033 350/500 [====================>.........] - ETA: 51s - loss: 0.9539 - regression_loss: 0.8508 - classification_loss: 0.1031 351/500 [====================>.........] - ETA: 50s - loss: 0.9540 - regression_loss: 0.8509 - classification_loss: 0.1031 352/500 [====================>.........] - ETA: 50s - loss: 0.9532 - regression_loss: 0.8503 - classification_loss: 0.1030 353/500 [====================>.........] - ETA: 50s - loss: 0.9530 - regression_loss: 0.8501 - classification_loss: 0.1029 354/500 [====================>.........] - ETA: 49s - loss: 0.9542 - regression_loss: 0.8510 - classification_loss: 0.1032 355/500 [====================>.........] - ETA: 49s - loss: 0.9561 - regression_loss: 0.8524 - classification_loss: 0.1037 356/500 [====================>.........] 
- ETA: 49s - loss: 0.9553 - regression_loss: 0.8518 - classification_loss: 0.1035 357/500 [====================>.........] - ETA: 48s - loss: 0.9550 - regression_loss: 0.8516 - classification_loss: 0.1034 358/500 [====================>.........] - ETA: 48s - loss: 0.9556 - regression_loss: 0.8520 - classification_loss: 0.1036 359/500 [====================>.........] - ETA: 48s - loss: 0.9544 - regression_loss: 0.8510 - classification_loss: 0.1033 360/500 [====================>.........] - ETA: 47s - loss: 0.9536 - regression_loss: 0.8504 - classification_loss: 0.1032 361/500 [====================>.........] - ETA: 47s - loss: 0.9542 - regression_loss: 0.8510 - classification_loss: 0.1032 362/500 [====================>.........] - ETA: 47s - loss: 0.9531 - regression_loss: 0.8501 - classification_loss: 0.1030 363/500 [====================>.........] - ETA: 46s - loss: 0.9527 - regression_loss: 0.8498 - classification_loss: 0.1029 364/500 [====================>.........] - ETA: 46s - loss: 0.9542 - regression_loss: 0.8513 - classification_loss: 0.1029 365/500 [====================>.........] - ETA: 46s - loss: 0.9538 - regression_loss: 0.8509 - classification_loss: 0.1029 366/500 [====================>.........] - ETA: 45s - loss: 0.9531 - regression_loss: 0.8503 - classification_loss: 0.1028 367/500 [=====================>........] - ETA: 45s - loss: 0.9536 - regression_loss: 0.8507 - classification_loss: 0.1029 368/500 [=====================>........] - ETA: 44s - loss: 0.9529 - regression_loss: 0.8500 - classification_loss: 0.1029 369/500 [=====================>........] - ETA: 44s - loss: 0.9515 - regression_loss: 0.8489 - classification_loss: 0.1027 370/500 [=====================>........] - ETA: 44s - loss: 0.9511 - regression_loss: 0.8485 - classification_loss: 0.1026 371/500 [=====================>........] - ETA: 43s - loss: 0.9518 - regression_loss: 0.8491 - classification_loss: 0.1027 372/500 [=====================>........] 
- ETA: 43s - loss: 0.9524 - regression_loss: 0.8497 - classification_loss: 0.1027 373/500 [=====================>........] - ETA: 43s - loss: 0.9535 - regression_loss: 0.8504 - classification_loss: 0.1031 374/500 [=====================>........] - ETA: 42s - loss: 0.9532 - regression_loss: 0.8502 - classification_loss: 0.1031 375/500 [=====================>........] - ETA: 42s - loss: 0.9524 - regression_loss: 0.8495 - classification_loss: 0.1029 376/500 [=====================>........] - ETA: 42s - loss: 0.9528 - regression_loss: 0.8499 - classification_loss: 0.1029 377/500 [=====================>........] - ETA: 41s - loss: 0.9536 - regression_loss: 0.8506 - classification_loss: 0.1029 378/500 [=====================>........] - ETA: 41s - loss: 0.9533 - regression_loss: 0.8505 - classification_loss: 0.1028 379/500 [=====================>........] - ETA: 41s - loss: 0.9534 - regression_loss: 0.8507 - classification_loss: 0.1027 380/500 [=====================>........] - ETA: 40s - loss: 0.9522 - regression_loss: 0.8497 - classification_loss: 0.1025 381/500 [=====================>........] - ETA: 40s - loss: 0.9510 - regression_loss: 0.8486 - classification_loss: 0.1025 382/500 [=====================>........] - ETA: 40s - loss: 0.9569 - regression_loss: 0.8529 - classification_loss: 0.1040 383/500 [=====================>........] - ETA: 39s - loss: 0.9552 - regression_loss: 0.8514 - classification_loss: 0.1038 384/500 [======================>.......] - ETA: 39s - loss: 0.9536 - regression_loss: 0.8500 - classification_loss: 0.1036 385/500 [======================>.......] - ETA: 39s - loss: 0.9532 - regression_loss: 0.8497 - classification_loss: 0.1034 386/500 [======================>.......] - ETA: 38s - loss: 0.9544 - regression_loss: 0.8506 - classification_loss: 0.1037 387/500 [======================>.......] - ETA: 38s - loss: 0.9532 - regression_loss: 0.8497 - classification_loss: 0.1035 388/500 [======================>.......] 
- ETA: 38s - loss: 0.9534 - regression_loss: 0.8498 - classification_loss: 0.1035 389/500 [======================>.......] - ETA: 37s - loss: 0.9529 - regression_loss: 0.8495 - classification_loss: 0.1034 390/500 [======================>.......] - ETA: 37s - loss: 0.9528 - regression_loss: 0.8494 - classification_loss: 0.1034 391/500 [======================>.......] - ETA: 37s - loss: 0.9528 - regression_loss: 0.8496 - classification_loss: 0.1033 392/500 [======================>.......] - ETA: 36s - loss: 0.9527 - regression_loss: 0.8495 - classification_loss: 0.1032 393/500 [======================>.......] - ETA: 36s - loss: 0.9523 - regression_loss: 0.8492 - classification_loss: 0.1031 394/500 [======================>.......] - ETA: 36s - loss: 0.9522 - regression_loss: 0.8492 - classification_loss: 0.1030 395/500 [======================>.......] - ETA: 35s - loss: 0.9534 - regression_loss: 0.8503 - classification_loss: 0.1031 396/500 [======================>.......] - ETA: 35s - loss: 0.9526 - regression_loss: 0.8496 - classification_loss: 0.1030 397/500 [======================>.......] - ETA: 35s - loss: 0.9526 - regression_loss: 0.8496 - classification_loss: 0.1031 398/500 [======================>.......] - ETA: 34s - loss: 0.9523 - regression_loss: 0.8494 - classification_loss: 0.1029 399/500 [======================>.......] - ETA: 34s - loss: 0.9523 - regression_loss: 0.8496 - classification_loss: 0.1028 400/500 [=======================>......] - ETA: 34s - loss: 0.9517 - regression_loss: 0.8490 - classification_loss: 0.1027 401/500 [=======================>......] - ETA: 33s - loss: 0.9529 - regression_loss: 0.8500 - classification_loss: 0.1029 402/500 [=======================>......] - ETA: 33s - loss: 0.9539 - regression_loss: 0.8509 - classification_loss: 0.1029 403/500 [=======================>......] - ETA: 33s - loss: 0.9541 - regression_loss: 0.8511 - classification_loss: 0.1030 404/500 [=======================>......] 
- ETA: 32s - loss: 0.9544 - regression_loss: 0.8514 - classification_loss: 0.1030 405/500 [=======================>......] - ETA: 32s - loss: 0.9540 - regression_loss: 0.8512 - classification_loss: 0.1028 406/500 [=======================>......] - ETA: 32s - loss: 0.9541 - regression_loss: 0.8513 - classification_loss: 0.1028 407/500 [=======================>......] - ETA: 31s - loss: 0.9533 - regression_loss: 0.8507 - classification_loss: 0.1026 408/500 [=======================>......] - ETA: 31s - loss: 0.9540 - regression_loss: 0.8514 - classification_loss: 0.1027 409/500 [=======================>......] - ETA: 30s - loss: 0.9553 - regression_loss: 0.8523 - classification_loss: 0.1030 410/500 [=======================>......] - ETA: 30s - loss: 0.9556 - regression_loss: 0.8526 - classification_loss: 0.1029 411/500 [=======================>......] - ETA: 30s - loss: 0.9564 - regression_loss: 0.8534 - classification_loss: 0.1029 412/500 [=======================>......] - ETA: 29s - loss: 0.9554 - regression_loss: 0.8526 - classification_loss: 0.1028 413/500 [=======================>......] - ETA: 29s - loss: 0.9549 - regression_loss: 0.8521 - classification_loss: 0.1027 414/500 [=======================>......] - ETA: 29s - loss: 0.9543 - regression_loss: 0.8517 - classification_loss: 0.1027 415/500 [=======================>......] - ETA: 28s - loss: 0.9545 - regression_loss: 0.8519 - classification_loss: 0.1026 416/500 [=======================>......] - ETA: 28s - loss: 0.9555 - regression_loss: 0.8529 - classification_loss: 0.1027 417/500 [========================>.....] - ETA: 28s - loss: 0.9556 - regression_loss: 0.8529 - classification_loss: 0.1028 418/500 [========================>.....] - ETA: 27s - loss: 0.9558 - regression_loss: 0.8530 - classification_loss: 0.1028 419/500 [========================>.....] - ETA: 27s - loss: 0.9557 - regression_loss: 0.8530 - classification_loss: 0.1027 420/500 [========================>.....] 
- ETA: 27s - loss: 0.9560 - regression_loss: 0.8533 - classification_loss: 0.1027 421/500 [========================>.....] - ETA: 26s - loss: 0.9556 - regression_loss: 0.8530 - classification_loss: 0.1026 422/500 [========================>.....] - ETA: 26s - loss: 0.9568 - regression_loss: 0.8540 - classification_loss: 0.1027 423/500 [========================>.....] - ETA: 26s - loss: 0.9563 - regression_loss: 0.8537 - classification_loss: 0.1026 424/500 [========================>.....] - ETA: 25s - loss: 0.9577 - regression_loss: 0.8548 - classification_loss: 0.1028 425/500 [========================>.....] - ETA: 25s - loss: 0.9569 - regression_loss: 0.8541 - classification_loss: 0.1028 426/500 [========================>.....] - ETA: 25s - loss: 0.9582 - regression_loss: 0.8552 - classification_loss: 0.1031 427/500 [========================>.....] - ETA: 24s - loss: 0.9584 - regression_loss: 0.8554 - classification_loss: 0.1030 428/500 [========================>.....] - ETA: 24s - loss: 0.9580 - regression_loss: 0.8550 - classification_loss: 0.1029 429/500 [========================>.....] - ETA: 24s - loss: 0.9573 - regression_loss: 0.8545 - classification_loss: 0.1028 430/500 [========================>.....] - ETA: 23s - loss: 0.9560 - regression_loss: 0.8534 - classification_loss: 0.1026 431/500 [========================>.....] - ETA: 23s - loss: 0.9570 - regression_loss: 0.8542 - classification_loss: 0.1027 432/500 [========================>.....] - ETA: 23s - loss: 0.9567 - regression_loss: 0.8541 - classification_loss: 0.1026 433/500 [========================>.....] - ETA: 22s - loss: 0.9561 - regression_loss: 0.8536 - classification_loss: 0.1025 434/500 [=========================>....] - ETA: 22s - loss: 0.9564 - regression_loss: 0.8539 - classification_loss: 0.1025 435/500 [=========================>....] - ETA: 22s - loss: 0.9567 - regression_loss: 0.8543 - classification_loss: 0.1024 436/500 [=========================>....] 
- ETA: 21s - loss: 0.9570 - regression_loss: 0.8546 - classification_loss: 0.1023 437/500 [=========================>....] - ETA: 21s - loss: 0.9558 - regression_loss: 0.8536 - classification_loss: 0.1022 438/500 [=========================>....] - ETA: 21s - loss: 0.9555 - regression_loss: 0.8534 - classification_loss: 0.1021 439/500 [=========================>....] - ETA: 20s - loss: 0.9546 - regression_loss: 0.8526 - classification_loss: 0.1020 440/500 [=========================>....] - ETA: 20s - loss: 0.9537 - regression_loss: 0.8518 - classification_loss: 0.1018 441/500 [=========================>....] - ETA: 20s - loss: 0.9534 - regression_loss: 0.8516 - classification_loss: 0.1018 442/500 [=========================>....] - ETA: 19s - loss: 0.9524 - regression_loss: 0.8506 - classification_loss: 0.1018 443/500 [=========================>....] - ETA: 19s - loss: 0.9527 - regression_loss: 0.8508 - classification_loss: 0.1019 444/500 [=========================>....] - ETA: 19s - loss: 0.9520 - regression_loss: 0.8502 - classification_loss: 0.1018 445/500 [=========================>....] - ETA: 18s - loss: 0.9519 - regression_loss: 0.8502 - classification_loss: 0.1017 446/500 [=========================>....] - ETA: 18s - loss: 0.9515 - regression_loss: 0.8498 - classification_loss: 0.1017 447/500 [=========================>....] - ETA: 18s - loss: 0.9521 - regression_loss: 0.8502 - classification_loss: 0.1019 448/500 [=========================>....] - ETA: 17s - loss: 0.9508 - regression_loss: 0.8491 - classification_loss: 0.1017 449/500 [=========================>....] - ETA: 17s - loss: 0.9517 - regression_loss: 0.8499 - classification_loss: 0.1018 450/500 [==========================>...] - ETA: 17s - loss: 0.9506 - regression_loss: 0.8490 - classification_loss: 0.1016 451/500 [==========================>...] - ETA: 16s - loss: 0.9510 - regression_loss: 0.8493 - classification_loss: 0.1017 452/500 [==========================>...] 
- ETA: 16s - loss: 0.9534 - regression_loss: 0.8511 - classification_loss: 0.1023 453/500 [==========================>...] - ETA: 16s - loss: 0.9522 - regression_loss: 0.8501 - classification_loss: 0.1022 454/500 [==========================>...] - ETA: 15s - loss: 0.9536 - regression_loss: 0.8514 - classification_loss: 0.1022 455/500 [==========================>...] - ETA: 15s - loss: 0.9537 - regression_loss: 0.8515 - classification_loss: 0.1022 456/500 [==========================>...] - ETA: 14s - loss: 0.9539 - regression_loss: 0.8518 - classification_loss: 0.1021 457/500 [==========================>...] - ETA: 14s - loss: 0.9536 - regression_loss: 0.8516 - classification_loss: 0.1020 458/500 [==========================>...] - ETA: 14s - loss: 0.9537 - regression_loss: 0.8516 - classification_loss: 0.1021 459/500 [==========================>...] - ETA: 13s - loss: 0.9534 - regression_loss: 0.8514 - classification_loss: 0.1021 460/500 [==========================>...] - ETA: 13s - loss: 0.9547 - regression_loss: 0.8525 - classification_loss: 0.1022 461/500 [==========================>...] - ETA: 13s - loss: 0.9539 - regression_loss: 0.8519 - classification_loss: 0.1020 462/500 [==========================>...] - ETA: 12s - loss: 0.9549 - regression_loss: 0.8528 - classification_loss: 0.1021 463/500 [==========================>...] - ETA: 12s - loss: 0.9542 - regression_loss: 0.8522 - classification_loss: 0.1019 464/500 [==========================>...] - ETA: 12s - loss: 0.9541 - regression_loss: 0.8520 - classification_loss: 0.1021 465/500 [==========================>...] - ETA: 11s - loss: 0.9537 - regression_loss: 0.8516 - classification_loss: 0.1021 466/500 [==========================>...] - ETA: 11s - loss: 0.9543 - regression_loss: 0.8522 - classification_loss: 0.1021 467/500 [===========================>..] - ETA: 11s - loss: 0.9544 - regression_loss: 0.8523 - classification_loss: 0.1022 468/500 [===========================>..] 
- ETA: 10s - loss: 0.9547 - regression_loss: 0.8525 - classification_loss: 0.1022 469/500 [===========================>..] - ETA: 10s - loss: 0.9547 - regression_loss: 0.8526 - classification_loss: 0.1021 470/500 [===========================>..] - ETA: 10s - loss: 0.9561 - regression_loss: 0.8538 - classification_loss: 0.1023 471/500 [===========================>..] - ETA: 9s - loss: 0.9572 - regression_loss: 0.8547 - classification_loss: 0.1025  472/500 [===========================>..] - ETA: 9s - loss: 0.9566 - regression_loss: 0.8542 - classification_loss: 0.1024 473/500 [===========================>..] - ETA: 9s - loss: 0.9565 - regression_loss: 0.8541 - classification_loss: 0.1024 474/500 [===========================>..] - ETA: 8s - loss: 0.9558 - regression_loss: 0.8535 - classification_loss: 0.1023 475/500 [===========================>..] - ETA: 8s - loss: 0.9564 - regression_loss: 0.8540 - classification_loss: 0.1025 476/500 [===========================>..] - ETA: 8s - loss: 0.9553 - regression_loss: 0.8530 - classification_loss: 0.1023 477/500 [===========================>..] - ETA: 7s - loss: 0.9545 - regression_loss: 0.8523 - classification_loss: 0.1022 478/500 [===========================>..] - ETA: 7s - loss: 0.9544 - regression_loss: 0.8523 - classification_loss: 0.1022 479/500 [===========================>..] - ETA: 7s - loss: 0.9549 - regression_loss: 0.8527 - classification_loss: 0.1023 480/500 [===========================>..] - ETA: 6s - loss: 0.9555 - regression_loss: 0.8530 - classification_loss: 0.1024 481/500 [===========================>..] - ETA: 6s - loss: 0.9544 - regression_loss: 0.8522 - classification_loss: 0.1023 482/500 [===========================>..] - ETA: 6s - loss: 0.9541 - regression_loss: 0.8520 - classification_loss: 0.1021 483/500 [===========================>..] - ETA: 5s - loss: 0.9537 - regression_loss: 0.8517 - classification_loss: 0.1020 484/500 [============================>.] 
- ETA: 5s - loss: 0.9524 - regression_loss: 0.8505 - classification_loss: 0.1019 485/500 [============================>.] - ETA: 5s - loss: 0.9512 - regression_loss: 0.8494 - classification_loss: 0.1018 486/500 [============================>.] - ETA: 4s - loss: 0.9514 - regression_loss: 0.8497 - classification_loss: 0.1017 487/500 [============================>.] - ETA: 4s - loss: 0.9506 - regression_loss: 0.8489 - classification_loss: 0.1017 488/500 [============================>.] - ETA: 4s - loss: 0.9510 - regression_loss: 0.8493 - classification_loss: 0.1017 489/500 [============================>.] - ETA: 3s - loss: 0.9501 - regression_loss: 0.8485 - classification_loss: 0.1016 490/500 [============================>.] - ETA: 3s - loss: 0.9501 - regression_loss: 0.8486 - classification_loss: 0.1015 491/500 [============================>.] - ETA: 3s - loss: 0.9496 - regression_loss: 0.8482 - classification_loss: 0.1014 492/500 [============================>.] - ETA: 2s - loss: 0.9480 - regression_loss: 0.8468 - classification_loss: 0.1012 493/500 [============================>.] - ETA: 2s - loss: 0.9475 - regression_loss: 0.8463 - classification_loss: 0.1012 494/500 [============================>.] - ETA: 2s - loss: 0.9465 - regression_loss: 0.8454 - classification_loss: 0.1010 495/500 [============================>.] - ETA: 1s - loss: 0.9464 - regression_loss: 0.8454 - classification_loss: 0.1010 496/500 [============================>.] - ETA: 1s - loss: 0.9466 - regression_loss: 0.8455 - classification_loss: 0.1010 497/500 [============================>.] - ETA: 1s - loss: 0.9447 - regression_loss: 0.8438 - classification_loss: 0.1008 498/500 [============================>.] - ETA: 0s - loss: 0.9450 - regression_loss: 0.8441 - classification_loss: 0.1009 499/500 [============================>.] 
- ETA: 0s - loss: 0.9453 - regression_loss: 0.8444 - classification_loss: 0.1009 500/500 [==============================] - 170s 340ms/step - loss: 0.9452 - regression_loss: 0.8444 - classification_loss: 0.1009 326 instances of class plum with average precision: 0.8145 mAP: 0.8145 Epoch 00028: saving model to ./training/snapshots/resnet101_pascal_28.h5 Epoch 29/150 1/500 [..............................] - ETA: 2:29 - loss: 1.3113 - regression_loss: 1.2029 - classification_loss: 0.1084 2/500 [..............................] - ETA: 2:34 - loss: 1.2373 - regression_loss: 1.1272 - classification_loss: 0.1101 3/500 [..............................] - ETA: 2:36 - loss: 1.1594 - regression_loss: 1.0445 - classification_loss: 0.1149 4/500 [..............................] - ETA: 2:39 - loss: 1.1875 - regression_loss: 1.0664 - classification_loss: 0.1211 5/500 [..............................] - ETA: 2:40 - loss: 1.1333 - regression_loss: 1.0190 - classification_loss: 0.1143 6/500 [..............................] - ETA: 2:41 - loss: 1.1006 - regression_loss: 0.9870 - classification_loss: 0.1136 7/500 [..............................] - ETA: 2:41 - loss: 1.0864 - regression_loss: 0.9714 - classification_loss: 0.1151 8/500 [..............................] - ETA: 2:41 - loss: 1.0542 - regression_loss: 0.9440 - classification_loss: 0.1102 9/500 [..............................] - ETA: 2:42 - loss: 1.0538 - regression_loss: 0.9407 - classification_loss: 0.1131 10/500 [..............................] - ETA: 2:42 - loss: 1.0020 - regression_loss: 0.8965 - classification_loss: 0.1055 11/500 [..............................] - ETA: 2:41 - loss: 0.9747 - regression_loss: 0.8739 - classification_loss: 0.1009 12/500 [..............................] - ETA: 2:41 - loss: 0.9820 - regression_loss: 0.8799 - classification_loss: 0.1021 13/500 [..............................] - ETA: 2:42 - loss: 0.9594 - regression_loss: 0.8610 - classification_loss: 0.0983 14/500 [..............................] 
- ETA: 2:41 - loss: 0.9457 - regression_loss: 0.8475 - classification_loss: 0.0982 15/500 [..............................] - ETA: 2:42 - loss: 0.9244 - regression_loss: 0.8297 - classification_loss: 0.0947 16/500 [..............................] - ETA: 2:41 - loss: 0.9149 - regression_loss: 0.8226 - classification_loss: 0.0923 17/500 [>.............................] - ETA: 2:41 - loss: 0.9030 - regression_loss: 0.8140 - classification_loss: 0.0890 18/500 [>.............................] - ETA: 2:41 - loss: 0.8913 - regression_loss: 0.8033 - classification_loss: 0.0879 19/500 [>.............................] - ETA: 2:40 - loss: 0.8904 - regression_loss: 0.8017 - classification_loss: 0.0888 20/500 [>.............................] - ETA: 2:40 - loss: 0.8990 - regression_loss: 0.8086 - classification_loss: 0.0904 21/500 [>.............................] - ETA: 2:40 - loss: 0.8863 - regression_loss: 0.7976 - classification_loss: 0.0887 22/500 [>.............................] - ETA: 2:40 - loss: 0.8697 - regression_loss: 0.7828 - classification_loss: 0.0869 23/500 [>.............................] - ETA: 2:40 - loss: 0.8322 - regression_loss: 0.7488 - classification_loss: 0.0835 24/500 [>.............................] - ETA: 2:39 - loss: 0.8060 - regression_loss: 0.7246 - classification_loss: 0.0814 25/500 [>.............................] - ETA: 2:39 - loss: 0.8632 - regression_loss: 0.7745 - classification_loss: 0.0887 26/500 [>.............................] - ETA: 2:39 - loss: 0.8650 - regression_loss: 0.7720 - classification_loss: 0.0930 27/500 [>.............................] - ETA: 2:39 - loss: 0.8754 - regression_loss: 0.7809 - classification_loss: 0.0945 28/500 [>.............................] - ETA: 2:38 - loss: 0.8596 - regression_loss: 0.7672 - classification_loss: 0.0924 29/500 [>.............................] - ETA: 2:38 - loss: 0.8709 - regression_loss: 0.7749 - classification_loss: 0.0960 30/500 [>.............................] 
- ETA: 2:38 - loss: 0.8562 - regression_loss: 0.7619 - classification_loss: 0.0943 31/500 [>.............................] - ETA: 2:38 - loss: 0.8584 - regression_loss: 0.7630 - classification_loss: 0.0953 32/500 [>.............................] - ETA: 2:37 - loss: 0.8640 - regression_loss: 0.7673 - classification_loss: 0.0966 33/500 [>.............................] - ETA: 2:37 - loss: 0.8670 - regression_loss: 0.7700 - classification_loss: 0.0970 34/500 [=>............................] - ETA: 2:37 - loss: 0.8936 - regression_loss: 0.7910 - classification_loss: 0.1026 35/500 [=>............................] - ETA: 2:37 - loss: 0.8786 - regression_loss: 0.7779 - classification_loss: 0.1007 36/500 [=>............................] - ETA: 2:37 - loss: 0.8692 - regression_loss: 0.7702 - classification_loss: 0.0990 37/500 [=>............................] - ETA: 2:36 - loss: 0.8625 - regression_loss: 0.7656 - classification_loss: 0.0969 38/500 [=>............................] - ETA: 2:36 - loss: 0.8591 - regression_loss: 0.7633 - classification_loss: 0.0958 39/500 [=>............................] - ETA: 2:36 - loss: 0.8770 - regression_loss: 0.7793 - classification_loss: 0.0977 40/500 [=>............................] - ETA: 2:36 - loss: 0.8805 - regression_loss: 0.7814 - classification_loss: 0.0990 41/500 [=>............................] - ETA: 2:35 - loss: 0.8783 - regression_loss: 0.7795 - classification_loss: 0.0988 42/500 [=>............................] - ETA: 2:35 - loss: 0.8770 - regression_loss: 0.7785 - classification_loss: 0.0986 43/500 [=>............................] - ETA: 2:35 - loss: 0.8848 - regression_loss: 0.7827 - classification_loss: 0.1021 44/500 [=>............................] - ETA: 2:34 - loss: 0.8816 - regression_loss: 0.7796 - classification_loss: 0.1020 45/500 [=>............................] - ETA: 2:34 - loss: 0.9000 - regression_loss: 0.7928 - classification_loss: 0.1073 46/500 [=>............................] 
[progress-bar updates for steps 47–382 of 500 condensed; the original carriage-return-overwritten bars were flattened into repeated lines. Recoverable trajectory over this span: ETA fell from 2:33 to 40s; total loss fluctuated between ~0.87 and ~0.93 (step 47: loss 0.9050, regression_loss 0.7978, classification_loss 0.1072; step 200: loss 0.9086, regression_loss 0.7985, classification_loss 0.1101; step 381: loss 0.9328, regression_loss 0.8266, classification_loss 0.1061).]
- ETA: 40s - loss: 0.9327 - regression_loss: 0.8267 - classification_loss: 0.1061 383/500 [=====================>........] - ETA: 39s - loss: 0.9323 - regression_loss: 0.8264 - classification_loss: 0.1059 384/500 [======================>.......] - ETA: 39s - loss: 0.9320 - regression_loss: 0.8262 - classification_loss: 0.1058 385/500 [======================>.......] - ETA: 39s - loss: 0.9320 - regression_loss: 0.8263 - classification_loss: 0.1057 386/500 [======================>.......] - ETA: 38s - loss: 0.9337 - regression_loss: 0.8277 - classification_loss: 0.1061 387/500 [======================>.......] - ETA: 38s - loss: 0.9338 - regression_loss: 0.8278 - classification_loss: 0.1060 388/500 [======================>.......] - ETA: 37s - loss: 0.9337 - regression_loss: 0.8278 - classification_loss: 0.1058 389/500 [======================>.......] - ETA: 37s - loss: 0.9327 - regression_loss: 0.8269 - classification_loss: 0.1058 390/500 [======================>.......] - ETA: 37s - loss: 0.9325 - regression_loss: 0.8267 - classification_loss: 0.1058 391/500 [======================>.......] - ETA: 36s - loss: 0.9321 - regression_loss: 0.8263 - classification_loss: 0.1058 392/500 [======================>.......] - ETA: 36s - loss: 0.9329 - regression_loss: 0.8268 - classification_loss: 0.1061 393/500 [======================>.......] - ETA: 36s - loss: 0.9319 - regression_loss: 0.8259 - classification_loss: 0.1060 394/500 [======================>.......] - ETA: 35s - loss: 0.9329 - regression_loss: 0.8268 - classification_loss: 0.1061 395/500 [======================>.......] - ETA: 35s - loss: 0.9326 - regression_loss: 0.8267 - classification_loss: 0.1059 396/500 [======================>.......] - ETA: 35s - loss: 0.9328 - regression_loss: 0.8268 - classification_loss: 0.1059 397/500 [======================>.......] - ETA: 34s - loss: 0.9325 - regression_loss: 0.8266 - classification_loss: 0.1059 398/500 [======================>.......] 
- ETA: 34s - loss: 0.9322 - regression_loss: 0.8264 - classification_loss: 0.1058 399/500 [======================>.......] - ETA: 34s - loss: 0.9312 - regression_loss: 0.8256 - classification_loss: 0.1056 400/500 [=======================>......] - ETA: 33s - loss: 0.9309 - regression_loss: 0.8254 - classification_loss: 0.1055 401/500 [=======================>......] - ETA: 33s - loss: 0.9297 - regression_loss: 0.8244 - classification_loss: 0.1053 402/500 [=======================>......] - ETA: 33s - loss: 0.9304 - regression_loss: 0.8251 - classification_loss: 0.1053 403/500 [=======================>......] - ETA: 32s - loss: 0.9312 - regression_loss: 0.8259 - classification_loss: 0.1053 404/500 [=======================>......] - ETA: 32s - loss: 0.9319 - regression_loss: 0.8263 - classification_loss: 0.1056 405/500 [=======================>......] - ETA: 32s - loss: 0.9322 - regression_loss: 0.8265 - classification_loss: 0.1057 406/500 [=======================>......] - ETA: 31s - loss: 0.9318 - regression_loss: 0.8262 - classification_loss: 0.1056 407/500 [=======================>......] - ETA: 31s - loss: 0.9314 - regression_loss: 0.8259 - classification_loss: 0.1055 408/500 [=======================>......] - ETA: 31s - loss: 0.9305 - regression_loss: 0.8251 - classification_loss: 0.1054 409/500 [=======================>......] - ETA: 30s - loss: 0.9297 - regression_loss: 0.8244 - classification_loss: 0.1053 410/500 [=======================>......] - ETA: 30s - loss: 0.9289 - regression_loss: 0.8237 - classification_loss: 0.1052 411/500 [=======================>......] - ETA: 30s - loss: 0.9294 - regression_loss: 0.8242 - classification_loss: 0.1052 412/500 [=======================>......] - ETA: 29s - loss: 0.9390 - regression_loss: 0.8319 - classification_loss: 0.1071 413/500 [=======================>......] - ETA: 29s - loss: 0.9378 - regression_loss: 0.8308 - classification_loss: 0.1070 414/500 [=======================>......] 
- ETA: 29s - loss: 0.9372 - regression_loss: 0.8303 - classification_loss: 0.1069 415/500 [=======================>......] - ETA: 28s - loss: 0.9374 - regression_loss: 0.8306 - classification_loss: 0.1068 416/500 [=======================>......] - ETA: 28s - loss: 0.9382 - regression_loss: 0.8313 - classification_loss: 0.1070 417/500 [========================>.....] - ETA: 28s - loss: 0.9385 - regression_loss: 0.8316 - classification_loss: 0.1068 418/500 [========================>.....] - ETA: 27s - loss: 0.9389 - regression_loss: 0.8321 - classification_loss: 0.1068 419/500 [========================>.....] - ETA: 27s - loss: 0.9375 - regression_loss: 0.8309 - classification_loss: 0.1066 420/500 [========================>.....] - ETA: 27s - loss: 0.9386 - regression_loss: 0.8316 - classification_loss: 0.1070 421/500 [========================>.....] - ETA: 26s - loss: 0.9385 - regression_loss: 0.8315 - classification_loss: 0.1070 422/500 [========================>.....] - ETA: 26s - loss: 0.9375 - regression_loss: 0.8307 - classification_loss: 0.1068 423/500 [========================>.....] - ETA: 26s - loss: 0.9372 - regression_loss: 0.8304 - classification_loss: 0.1068 424/500 [========================>.....] - ETA: 25s - loss: 0.9360 - regression_loss: 0.8294 - classification_loss: 0.1066 425/500 [========================>.....] - ETA: 25s - loss: 0.9363 - regression_loss: 0.8297 - classification_loss: 0.1065 426/500 [========================>.....] - ETA: 25s - loss: 0.9347 - regression_loss: 0.8283 - classification_loss: 0.1063 427/500 [========================>.....] - ETA: 24s - loss: 0.9342 - regression_loss: 0.8279 - classification_loss: 0.1062 428/500 [========================>.....] - ETA: 24s - loss: 0.9331 - regression_loss: 0.8271 - classification_loss: 0.1060 429/500 [========================>.....] - ETA: 24s - loss: 0.9353 - regression_loss: 0.8288 - classification_loss: 0.1065 430/500 [========================>.....] 
- ETA: 23s - loss: 0.9359 - regression_loss: 0.8295 - classification_loss: 0.1064 431/500 [========================>.....] - ETA: 23s - loss: 0.9359 - regression_loss: 0.8296 - classification_loss: 0.1064 432/500 [========================>.....] - ETA: 23s - loss: 0.9366 - regression_loss: 0.8302 - classification_loss: 0.1064 433/500 [========================>.....] - ETA: 22s - loss: 0.9357 - regression_loss: 0.8294 - classification_loss: 0.1063 434/500 [=========================>....] - ETA: 22s - loss: 0.9370 - regression_loss: 0.8304 - classification_loss: 0.1066 435/500 [=========================>....] - ETA: 22s - loss: 0.9370 - regression_loss: 0.8305 - classification_loss: 0.1065 436/500 [=========================>....] - ETA: 21s - loss: 0.9373 - regression_loss: 0.8308 - classification_loss: 0.1064 437/500 [=========================>....] - ETA: 21s - loss: 0.9372 - regression_loss: 0.8308 - classification_loss: 0.1064 438/500 [=========================>....] - ETA: 21s - loss: 0.9374 - regression_loss: 0.8311 - classification_loss: 0.1063 439/500 [=========================>....] - ETA: 20s - loss: 0.9365 - regression_loss: 0.8304 - classification_loss: 0.1062 440/500 [=========================>....] - ETA: 20s - loss: 0.9368 - regression_loss: 0.8308 - classification_loss: 0.1060 441/500 [=========================>....] - ETA: 20s - loss: 0.9375 - regression_loss: 0.8315 - classification_loss: 0.1059 442/500 [=========================>....] - ETA: 19s - loss: 0.9368 - regression_loss: 0.8311 - classification_loss: 0.1058 443/500 [=========================>....] - ETA: 19s - loss: 0.9357 - regression_loss: 0.8301 - classification_loss: 0.1056 444/500 [=========================>....] - ETA: 18s - loss: 0.9358 - regression_loss: 0.8303 - classification_loss: 0.1055 445/500 [=========================>....] - ETA: 18s - loss: 0.9362 - regression_loss: 0.8307 - classification_loss: 0.1055 446/500 [=========================>....] 
- ETA: 18s - loss: 0.9392 - regression_loss: 0.8329 - classification_loss: 0.1063 447/500 [=========================>....] - ETA: 17s - loss: 0.9385 - regression_loss: 0.8323 - classification_loss: 0.1062 448/500 [=========================>....] - ETA: 17s - loss: 0.9405 - regression_loss: 0.8341 - classification_loss: 0.1063 449/500 [=========================>....] - ETA: 17s - loss: 0.9410 - regression_loss: 0.8346 - classification_loss: 0.1064 450/500 [==========================>...] - ETA: 16s - loss: 0.9404 - regression_loss: 0.8341 - classification_loss: 0.1063 451/500 [==========================>...] - ETA: 16s - loss: 0.9423 - regression_loss: 0.8354 - classification_loss: 0.1069 452/500 [==========================>...] - ETA: 16s - loss: 0.9412 - regression_loss: 0.8344 - classification_loss: 0.1068 453/500 [==========================>...] - ETA: 15s - loss: 0.9407 - regression_loss: 0.8341 - classification_loss: 0.1066 454/500 [==========================>...] - ETA: 15s - loss: 0.9399 - regression_loss: 0.8334 - classification_loss: 0.1065 455/500 [==========================>...] - ETA: 15s - loss: 0.9396 - regression_loss: 0.8332 - classification_loss: 0.1063 456/500 [==========================>...] - ETA: 14s - loss: 0.9388 - regression_loss: 0.8326 - classification_loss: 0.1062 457/500 [==========================>...] - ETA: 14s - loss: 0.9393 - regression_loss: 0.8332 - classification_loss: 0.1061 458/500 [==========================>...] - ETA: 14s - loss: 0.9392 - regression_loss: 0.8331 - classification_loss: 0.1060 459/500 [==========================>...] - ETA: 13s - loss: 0.9402 - regression_loss: 0.8338 - classification_loss: 0.1064 460/500 [==========================>...] - ETA: 13s - loss: 0.9399 - regression_loss: 0.8335 - classification_loss: 0.1064 461/500 [==========================>...] - ETA: 13s - loss: 0.9406 - regression_loss: 0.8342 - classification_loss: 0.1064 462/500 [==========================>...] 
- ETA: 12s - loss: 0.9408 - regression_loss: 0.8344 - classification_loss: 0.1064 463/500 [==========================>...] - ETA: 12s - loss: 0.9407 - regression_loss: 0.8344 - classification_loss: 0.1064 464/500 [==========================>...] - ETA: 12s - loss: 0.9403 - regression_loss: 0.8340 - classification_loss: 0.1063 465/500 [==========================>...] - ETA: 11s - loss: 0.9394 - regression_loss: 0.8332 - classification_loss: 0.1062 466/500 [==========================>...] - ETA: 11s - loss: 0.9387 - regression_loss: 0.8325 - classification_loss: 0.1062 467/500 [===========================>..] - ETA: 11s - loss: 0.9382 - regression_loss: 0.8321 - classification_loss: 0.1061 468/500 [===========================>..] - ETA: 10s - loss: 0.9385 - regression_loss: 0.8324 - classification_loss: 0.1061 469/500 [===========================>..] - ETA: 10s - loss: 0.9393 - regression_loss: 0.8332 - classification_loss: 0.1061 470/500 [===========================>..] - ETA: 10s - loss: 0.9385 - regression_loss: 0.8325 - classification_loss: 0.1060 471/500 [===========================>..] - ETA: 9s - loss: 0.9389 - regression_loss: 0.8328 - classification_loss: 0.1061  472/500 [===========================>..] - ETA: 9s - loss: 0.9393 - regression_loss: 0.8331 - classification_loss: 0.1062 473/500 [===========================>..] - ETA: 9s - loss: 0.9392 - regression_loss: 0.8331 - classification_loss: 0.1062 474/500 [===========================>..] - ETA: 8s - loss: 0.9398 - regression_loss: 0.8336 - classification_loss: 0.1062 475/500 [===========================>..] - ETA: 8s - loss: 0.9393 - regression_loss: 0.8331 - classification_loss: 0.1062 476/500 [===========================>..] - ETA: 8s - loss: 0.9387 - regression_loss: 0.8326 - classification_loss: 0.1061 477/500 [===========================>..] - ETA: 7s - loss: 0.9387 - regression_loss: 0.8327 - classification_loss: 0.1060 478/500 [===========================>..] 
- ETA: 7s - loss: 0.9388 - regression_loss: 0.8328 - classification_loss: 0.1060 479/500 [===========================>..] - ETA: 7s - loss: 0.9384 - regression_loss: 0.8325 - classification_loss: 0.1059 480/500 [===========================>..] - ETA: 6s - loss: 0.9402 - regression_loss: 0.8342 - classification_loss: 0.1061 481/500 [===========================>..] - ETA: 6s - loss: 0.9402 - regression_loss: 0.8342 - classification_loss: 0.1060 482/500 [===========================>..] - ETA: 6s - loss: 0.9407 - regression_loss: 0.8347 - classification_loss: 0.1060 483/500 [===========================>..] - ETA: 5s - loss: 0.9404 - regression_loss: 0.8344 - classification_loss: 0.1060 484/500 [============================>.] - ETA: 5s - loss: 0.9412 - regression_loss: 0.8352 - classification_loss: 0.1060 485/500 [============================>.] - ETA: 5s - loss: 0.9414 - regression_loss: 0.8354 - classification_loss: 0.1060 486/500 [============================>.] - ETA: 4s - loss: 0.9415 - regression_loss: 0.8355 - classification_loss: 0.1060 487/500 [============================>.] - ETA: 4s - loss: 0.9402 - regression_loss: 0.8345 - classification_loss: 0.1058 488/500 [============================>.] - ETA: 4s - loss: 0.9416 - regression_loss: 0.8355 - classification_loss: 0.1062 489/500 [============================>.] - ETA: 3s - loss: 0.9415 - regression_loss: 0.8354 - classification_loss: 0.1061 490/500 [============================>.] - ETA: 3s - loss: 0.9414 - regression_loss: 0.8353 - classification_loss: 0.1062 491/500 [============================>.] - ETA: 3s - loss: 0.9403 - regression_loss: 0.8343 - classification_loss: 0.1060 492/500 [============================>.] - ETA: 2s - loss: 0.9397 - regression_loss: 0.8338 - classification_loss: 0.1059 493/500 [============================>.] - ETA: 2s - loss: 0.9397 - regression_loss: 0.8339 - classification_loss: 0.1058 494/500 [============================>.] 
- ETA: 2s - loss: 0.9389 - regression_loss: 0.8333 - classification_loss: 0.1056 495/500 [============================>.] - ETA: 1s - loss: 0.9397 - regression_loss: 0.8341 - classification_loss: 0.1056 496/500 [============================>.] - ETA: 1s - loss: 0.9405 - regression_loss: 0.8349 - classification_loss: 0.1056 497/500 [============================>.] - ETA: 1s - loss: 0.9399 - regression_loss: 0.8344 - classification_loss: 0.1055 498/500 [============================>.] - ETA: 0s - loss: 0.9398 - regression_loss: 0.8344 - classification_loss: 0.1054 499/500 [============================>.] - ETA: 0s - loss: 0.9395 - regression_loss: 0.8338 - classification_loss: 0.1056 500/500 [==============================] - 170s 339ms/step - loss: 0.9392 - regression_loss: 0.8336 - classification_loss: 0.1056 326 instances of class plum with average precision: 0.8556 mAP: 0.8556 Epoch 00029: saving model to ./training/snapshots/resnet101_pascal_29.h5 Epoch 30/150 1/500 [..............................] - ETA: 2:37 - loss: 0.9272 - regression_loss: 0.8586 - classification_loss: 0.0687 2/500 [..............................] - ETA: 2:40 - loss: 1.0260 - regression_loss: 0.9030 - classification_loss: 0.1230 3/500 [..............................] - ETA: 2:42 - loss: 0.8547 - regression_loss: 0.7616 - classification_loss: 0.0932 4/500 [..............................] - ETA: 2:45 - loss: 0.7925 - regression_loss: 0.7182 - classification_loss: 0.0743 5/500 [..............................] - ETA: 2:44 - loss: 0.7950 - regression_loss: 0.7185 - classification_loss: 0.0765 6/500 [..............................] - ETA: 2:46 - loss: 0.8064 - regression_loss: 0.7320 - classification_loss: 0.0744 7/500 [..............................] - ETA: 2:46 - loss: 0.7347 - regression_loss: 0.6669 - classification_loss: 0.0678 8/500 [..............................] - ETA: 2:47 - loss: 0.6710 - regression_loss: 0.6088 - classification_loss: 0.0623 9/500 [..............................] 
- ETA: 2:45 - loss: 0.6949 - regression_loss: 0.6347 - classification_loss: 0.0602 10/500 [..............................] - ETA: 2:45 - loss: 0.6999 - regression_loss: 0.6390 - classification_loss: 0.0608 11/500 [..............................] - ETA: 2:46 - loss: 0.7032 - regression_loss: 0.6414 - classification_loss: 0.0617 12/500 [..............................] - ETA: 2:46 - loss: 0.6994 - regression_loss: 0.6392 - classification_loss: 0.0602 13/500 [..............................] - ETA: 2:46 - loss: 0.6755 - regression_loss: 0.6178 - classification_loss: 0.0577 14/500 [..............................] - ETA: 2:45 - loss: 0.7437 - regression_loss: 0.6669 - classification_loss: 0.0768 15/500 [..............................] - ETA: 2:45 - loss: 0.7925 - regression_loss: 0.7065 - classification_loss: 0.0861 16/500 [..............................] - ETA: 2:45 - loss: 0.7654 - regression_loss: 0.6837 - classification_loss: 0.0817 17/500 [>.............................] - ETA: 2:45 - loss: 0.7481 - regression_loss: 0.6702 - classification_loss: 0.0779 18/500 [>.............................] - ETA: 2:44 - loss: 0.7761 - regression_loss: 0.6945 - classification_loss: 0.0815 19/500 [>.............................] - ETA: 2:44 - loss: 0.7795 - regression_loss: 0.6981 - classification_loss: 0.0814 20/500 [>.............................] - ETA: 2:43 - loss: 0.7873 - regression_loss: 0.7065 - classification_loss: 0.0809 21/500 [>.............................] - ETA: 2:43 - loss: 0.7894 - regression_loss: 0.7080 - classification_loss: 0.0813 22/500 [>.............................] - ETA: 2:42 - loss: 0.7767 - regression_loss: 0.6966 - classification_loss: 0.0802 23/500 [>.............................] - ETA: 2:42 - loss: 0.7685 - regression_loss: 0.6883 - classification_loss: 0.0802 24/500 [>.............................] - ETA: 2:41 - loss: 0.7626 - regression_loss: 0.6838 - classification_loss: 0.0789 25/500 [>.............................] 
- ETA: 2:41 - loss: 0.7763 - regression_loss: 0.6954 - classification_loss: 0.0809 26/500 [>.............................] - ETA: 2:40 - loss: 0.7906 - regression_loss: 0.7076 - classification_loss: 0.0830 27/500 [>.............................] - ETA: 2:40 - loss: 0.8108 - regression_loss: 0.7242 - classification_loss: 0.0865 28/500 [>.............................] - ETA: 2:40 - loss: 0.8363 - regression_loss: 0.7427 - classification_loss: 0.0936 29/500 [>.............................] - ETA: 2:39 - loss: 0.8395 - regression_loss: 0.7443 - classification_loss: 0.0952 30/500 [>.............................] - ETA: 2:39 - loss: 0.8499 - regression_loss: 0.7543 - classification_loss: 0.0956 31/500 [>.............................] - ETA: 2:39 - loss: 0.8457 - regression_loss: 0.7511 - classification_loss: 0.0946 32/500 [>.............................] - ETA: 2:39 - loss: 0.8430 - regression_loss: 0.7489 - classification_loss: 0.0940 33/500 [>.............................] - ETA: 2:39 - loss: 0.8446 - regression_loss: 0.7510 - classification_loss: 0.0936 34/500 [=>............................] - ETA: 2:39 - loss: 0.8447 - regression_loss: 0.7515 - classification_loss: 0.0933 35/500 [=>............................] - ETA: 2:38 - loss: 0.8471 - regression_loss: 0.7545 - classification_loss: 0.0925 36/500 [=>............................] - ETA: 2:38 - loss: 0.8324 - regression_loss: 0.7407 - classification_loss: 0.0917 37/500 [=>............................] - ETA: 2:38 - loss: 0.8552 - regression_loss: 0.7631 - classification_loss: 0.0921 38/500 [=>............................] - ETA: 2:37 - loss: 0.8817 - regression_loss: 0.7867 - classification_loss: 0.0950 39/500 [=>............................] - ETA: 2:37 - loss: 0.8731 - regression_loss: 0.7792 - classification_loss: 0.0940 40/500 [=>............................] - ETA: 2:37 - loss: 0.8736 - regression_loss: 0.7800 - classification_loss: 0.0935 41/500 [=>............................] 
- ETA: 2:37 - loss: 0.9178 - regression_loss: 0.8154 - classification_loss: 0.1024 42/500 [=>............................] - ETA: 2:36 - loss: 0.9184 - regression_loss: 0.8166 - classification_loss: 0.1017 43/500 [=>............................] - ETA: 2:36 - loss: 0.9116 - regression_loss: 0.8112 - classification_loss: 0.1003 44/500 [=>............................] - ETA: 2:35 - loss: 0.9160 - regression_loss: 0.8156 - classification_loss: 0.1004 45/500 [=>............................] - ETA: 2:35 - loss: 0.9122 - regression_loss: 0.8131 - classification_loss: 0.0991 46/500 [=>............................] - ETA: 2:35 - loss: 0.9070 - regression_loss: 0.8091 - classification_loss: 0.0979 47/500 [=>............................] - ETA: 2:34 - loss: 0.9063 - regression_loss: 0.8092 - classification_loss: 0.0971 48/500 [=>............................] - ETA: 2:34 - loss: 0.9081 - regression_loss: 0.8111 - classification_loss: 0.0970 49/500 [=>............................] - ETA: 2:33 - loss: 0.9110 - regression_loss: 0.8140 - classification_loss: 0.0969 50/500 [==>...........................] - ETA: 2:33 - loss: 0.9047 - regression_loss: 0.8093 - classification_loss: 0.0954 51/500 [==>...........................] - ETA: 2:33 - loss: 0.9033 - regression_loss: 0.8088 - classification_loss: 0.0945 52/500 [==>...........................] - ETA: 2:32 - loss: 0.8928 - regression_loss: 0.7996 - classification_loss: 0.0932 53/500 [==>...........................] - ETA: 2:32 - loss: 0.8822 - regression_loss: 0.7902 - classification_loss: 0.0920 54/500 [==>...........................] - ETA: 2:31 - loss: 0.8815 - regression_loss: 0.7899 - classification_loss: 0.0917 55/500 [==>...........................] - ETA: 2:31 - loss: 0.8877 - regression_loss: 0.7951 - classification_loss: 0.0926 56/500 [==>...........................] - ETA: 2:31 - loss: 0.8759 - regression_loss: 0.7846 - classification_loss: 0.0912 57/500 [==>...........................] 
- ETA: 2:30 - loss: 0.8790 - regression_loss: 0.7872 - classification_loss: 0.0917 58/500 [==>...........................] - ETA: 2:30 - loss: 0.8819 - regression_loss: 0.7896 - classification_loss: 0.0924 59/500 [==>...........................] - ETA: 2:30 - loss: 0.8728 - regression_loss: 0.7815 - classification_loss: 0.0913 60/500 [==>...........................] - ETA: 2:29 - loss: 0.8715 - regression_loss: 0.7805 - classification_loss: 0.0910 61/500 [==>...........................] - ETA: 2:29 - loss: 0.8693 - regression_loss: 0.7787 - classification_loss: 0.0906 62/500 [==>...........................] - ETA: 2:28 - loss: 0.8726 - regression_loss: 0.7813 - classification_loss: 0.0913 63/500 [==>...........................] - ETA: 2:28 - loss: 0.8705 - regression_loss: 0.7790 - classification_loss: 0.0915 64/500 [==>...........................] - ETA: 2:27 - loss: 0.8804 - regression_loss: 0.7875 - classification_loss: 0.0929 65/500 [==>...........................] - ETA: 2:27 - loss: 0.8849 - regression_loss: 0.7910 - classification_loss: 0.0939 66/500 [==>...........................] - ETA: 2:27 - loss: 0.8850 - regression_loss: 0.7906 - classification_loss: 0.0944 67/500 [===>..........................] - ETA: 2:27 - loss: 0.8871 - regression_loss: 0.7930 - classification_loss: 0.0941 68/500 [===>..........................] - ETA: 2:26 - loss: 0.8822 - regression_loss: 0.7892 - classification_loss: 0.0930 69/500 [===>..........................] - ETA: 2:26 - loss: 0.8795 - regression_loss: 0.7858 - classification_loss: 0.0937 70/500 [===>..........................] - ETA: 2:26 - loss: 0.8856 - regression_loss: 0.7910 - classification_loss: 0.0946 71/500 [===>..........................] - ETA: 2:25 - loss: 0.8836 - regression_loss: 0.7893 - classification_loss: 0.0943 72/500 [===>..........................] - ETA: 2:25 - loss: 0.8826 - regression_loss: 0.7885 - classification_loss: 0.0942 73/500 [===>..........................] 
- ETA: 2:25 - loss: 0.8881 - regression_loss: 0.7934 - classification_loss: 0.0946 74/500 [===>..........................] - ETA: 2:24 - loss: 0.8868 - regression_loss: 0.7927 - classification_loss: 0.0941 75/500 [===>..........................] - ETA: 2:24 - loss: 0.8864 - regression_loss: 0.7925 - classification_loss: 0.0939 76/500 [===>..........................] - ETA: 2:24 - loss: 0.9011 - regression_loss: 0.8053 - classification_loss: 0.0957 77/500 [===>..........................] - ETA: 2:23 - loss: 0.9035 - regression_loss: 0.8071 - classification_loss: 0.0964 78/500 [===>..........................] - ETA: 2:23 - loss: 0.9003 - regression_loss: 0.8047 - classification_loss: 0.0955 79/500 [===>..........................] - ETA: 2:23 - loss: 0.8952 - regression_loss: 0.8008 - classification_loss: 0.0944 80/500 [===>..........................] - ETA: 2:22 - loss: 0.8889 - regression_loss: 0.7950 - classification_loss: 0.0938 81/500 [===>..........................] - ETA: 2:22 - loss: 0.8860 - regression_loss: 0.7931 - classification_loss: 0.0930 82/500 [===>..........................] - ETA: 2:22 - loss: 0.8822 - regression_loss: 0.7896 - classification_loss: 0.0926 83/500 [===>..........................] - ETA: 2:21 - loss: 0.8717 - regression_loss: 0.7801 - classification_loss: 0.0916 84/500 [====>.........................] - ETA: 2:21 - loss: 0.8725 - regression_loss: 0.7815 - classification_loss: 0.0910 85/500 [====>.........................] - ETA: 2:20 - loss: 0.8839 - regression_loss: 0.7893 - classification_loss: 0.0946 86/500 [====>.........................] - ETA: 2:20 - loss: 0.8775 - regression_loss: 0.7836 - classification_loss: 0.0939 87/500 [====>.........................] - ETA: 2:20 - loss: 0.8807 - regression_loss: 0.7870 - classification_loss: 0.0937 88/500 [====>.........................] - ETA: 2:20 - loss: 0.8845 - regression_loss: 0.7905 - classification_loss: 0.0939 89/500 [====>.........................] 
[training progress log condensed: Keras progress-bar updates for steps 90–425 of 500 were captured one per step; one representative line is kept per ~25 steps, values unchanged]
 90/500 [====>.........................] - ETA: 2:19 - loss: 0.8980 - regression_loss: 0.7999 - classification_loss: 0.0981
100/500 [=====>........................] - ETA: 2:15 - loss: 0.9018 - regression_loss: 0.8021 - classification_loss: 0.0997
125/500 [======>.......................] - ETA: 2:06 - loss: 0.8999 - regression_loss: 0.8045 - classification_loss: 0.0954
150/500 [========>.....................] - ETA: 1:58 - loss: 0.9226 - regression_loss: 0.8232 - classification_loss: 0.0994
175/500 [=========>....................] - ETA: 1:50 - loss: 0.9252 - regression_loss: 0.8231 - classification_loss: 0.1020
200/500 [===========>..................] - ETA: 1:41 - loss: 0.9077 - regression_loss: 0.8066 - classification_loss: 0.1012
225/500 [============>.................] - ETA: 1:33 - loss: 0.8965 - regression_loss: 0.7967 - classification_loss: 0.0998
250/500 [==============>...............] - ETA: 1:24 - loss: 0.9051 - regression_loss: 0.8029 - classification_loss: 0.1022
275/500 [===============>..............] - ETA: 1:16 - loss: 0.9234 - regression_loss: 0.8113 - classification_loss: 0.1121
300/500 [=================>............] - ETA: 1:07 - loss: 0.9169 - regression_loss: 0.8056 - classification_loss: 0.1113
325/500 [==================>...........] - ETA: 59s - loss: 0.9245 - regression_loss: 0.8134 - classification_loss: 0.1111
350/500 [====================>.........] - ETA: 50s - loss: 0.9189 - regression_loss: 0.8086 - classification_loss: 0.1103
375/500 [=====================>........] - ETA: 42s - loss: 0.9152 - regression_loss: 0.8057 - classification_loss: 0.1095
400/500 [=======================>......] - ETA: 33s - loss: 0.9138 - regression_loss: 0.8051 - classification_loss: 0.1087
424/500 [========================>.....] - ETA: 25s - loss: 0.9141 - regression_loss: 0.8059 - classification_loss: 0.1081
- ETA: 25s - loss: 0.9141 - regression_loss: 0.8060 - classification_loss: 0.1080 426/500 [========================>.....] - ETA: 25s - loss: 0.9142 - regression_loss: 0.8061 - classification_loss: 0.1081 427/500 [========================>.....] - ETA: 24s - loss: 0.9136 - regression_loss: 0.8057 - classification_loss: 0.1079 428/500 [========================>.....] - ETA: 24s - loss: 0.9136 - regression_loss: 0.8058 - classification_loss: 0.1078 429/500 [========================>.....] - ETA: 24s - loss: 0.9134 - regression_loss: 0.8055 - classification_loss: 0.1078 430/500 [========================>.....] - ETA: 23s - loss: 0.9135 - regression_loss: 0.8058 - classification_loss: 0.1077 431/500 [========================>.....] - ETA: 23s - loss: 0.9133 - regression_loss: 0.8057 - classification_loss: 0.1076 432/500 [========================>.....] - ETA: 23s - loss: 0.9126 - regression_loss: 0.8052 - classification_loss: 0.1074 433/500 [========================>.....] - ETA: 22s - loss: 0.9113 - regression_loss: 0.8041 - classification_loss: 0.1072 434/500 [=========================>....] - ETA: 22s - loss: 0.9105 - regression_loss: 0.8034 - classification_loss: 0.1071 435/500 [=========================>....] - ETA: 22s - loss: 0.9108 - regression_loss: 0.8037 - classification_loss: 0.1071 436/500 [=========================>....] - ETA: 21s - loss: 0.9109 - regression_loss: 0.8036 - classification_loss: 0.1073 437/500 [=========================>....] - ETA: 21s - loss: 0.9127 - regression_loss: 0.8050 - classification_loss: 0.1077 438/500 [=========================>....] - ETA: 21s - loss: 0.9130 - regression_loss: 0.8054 - classification_loss: 0.1076 439/500 [=========================>....] - ETA: 20s - loss: 0.9133 - regression_loss: 0.8057 - classification_loss: 0.1076 440/500 [=========================>....] - ETA: 20s - loss: 0.9135 - regression_loss: 0.8059 - classification_loss: 0.1076 441/500 [=========================>....] 
- ETA: 20s - loss: 0.9132 - regression_loss: 0.8056 - classification_loss: 0.1075 442/500 [=========================>....] - ETA: 19s - loss: 0.9123 - regression_loss: 0.8049 - classification_loss: 0.1074 443/500 [=========================>....] - ETA: 19s - loss: 0.9130 - regression_loss: 0.8054 - classification_loss: 0.1075 444/500 [=========================>....] - ETA: 19s - loss: 0.9128 - regression_loss: 0.8054 - classification_loss: 0.1074 445/500 [=========================>....] - ETA: 18s - loss: 0.9126 - regression_loss: 0.8053 - classification_loss: 0.1073 446/500 [=========================>....] - ETA: 18s - loss: 0.9124 - regression_loss: 0.8052 - classification_loss: 0.1072 447/500 [=========================>....] - ETA: 17s - loss: 0.9125 - regression_loss: 0.8055 - classification_loss: 0.1071 448/500 [=========================>....] - ETA: 17s - loss: 0.9124 - regression_loss: 0.8054 - classification_loss: 0.1069 449/500 [=========================>....] - ETA: 17s - loss: 0.9118 - regression_loss: 0.8049 - classification_loss: 0.1069 450/500 [==========================>...] - ETA: 16s - loss: 0.9117 - regression_loss: 0.8048 - classification_loss: 0.1069 451/500 [==========================>...] - ETA: 16s - loss: 0.9116 - regression_loss: 0.8047 - classification_loss: 0.1069 452/500 [==========================>...] - ETA: 16s - loss: 0.9117 - regression_loss: 0.8049 - classification_loss: 0.1068 453/500 [==========================>...] - ETA: 15s - loss: 0.9127 - regression_loss: 0.8055 - classification_loss: 0.1072 454/500 [==========================>...] - ETA: 15s - loss: 0.9122 - regression_loss: 0.8052 - classification_loss: 0.1071 455/500 [==========================>...] - ETA: 15s - loss: 0.9130 - regression_loss: 0.8057 - classification_loss: 0.1072 456/500 [==========================>...] - ETA: 14s - loss: 0.9125 - regression_loss: 0.8054 - classification_loss: 0.1071 457/500 [==========================>...] 
- ETA: 14s - loss: 0.9124 - regression_loss: 0.8052 - classification_loss: 0.1072 458/500 [==========================>...] - ETA: 14s - loss: 0.9114 - regression_loss: 0.8043 - classification_loss: 0.1070 459/500 [==========================>...] - ETA: 13s - loss: 0.9131 - regression_loss: 0.8056 - classification_loss: 0.1076 460/500 [==========================>...] - ETA: 13s - loss: 0.9149 - regression_loss: 0.8073 - classification_loss: 0.1076 461/500 [==========================>...] - ETA: 13s - loss: 0.9148 - regression_loss: 0.8073 - classification_loss: 0.1075 462/500 [==========================>...] - ETA: 12s - loss: 0.9136 - regression_loss: 0.8063 - classification_loss: 0.1073 463/500 [==========================>...] - ETA: 12s - loss: 0.9146 - regression_loss: 0.8072 - classification_loss: 0.1073 464/500 [==========================>...] - ETA: 12s - loss: 0.9144 - regression_loss: 0.8071 - classification_loss: 0.1072 465/500 [==========================>...] - ETA: 11s - loss: 0.9133 - regression_loss: 0.8062 - classification_loss: 0.1071 466/500 [==========================>...] - ETA: 11s - loss: 0.9126 - regression_loss: 0.8056 - classification_loss: 0.1070 467/500 [===========================>..] - ETA: 11s - loss: 0.9128 - regression_loss: 0.8060 - classification_loss: 0.1069 468/500 [===========================>..] - ETA: 10s - loss: 0.9117 - regression_loss: 0.8050 - classification_loss: 0.1067 469/500 [===========================>..] - ETA: 10s - loss: 0.9107 - regression_loss: 0.8042 - classification_loss: 0.1065 470/500 [===========================>..] - ETA: 10s - loss: 0.9115 - regression_loss: 0.8050 - classification_loss: 0.1065 471/500 [===========================>..] - ETA: 9s - loss: 0.9123 - regression_loss: 0.8057 - classification_loss: 0.1066  472/500 [===========================>..] - ETA: 9s - loss: 0.9159 - regression_loss: 0.8081 - classification_loss: 0.1078 473/500 [===========================>..] 
- ETA: 9s - loss: 0.9163 - regression_loss: 0.8083 - classification_loss: 0.1079 474/500 [===========================>..] - ETA: 8s - loss: 0.9159 - regression_loss: 0.8081 - classification_loss: 0.1078 475/500 [===========================>..] - ETA: 8s - loss: 0.9164 - regression_loss: 0.8086 - classification_loss: 0.1078 476/500 [===========================>..] - ETA: 8s - loss: 0.9152 - regression_loss: 0.8075 - classification_loss: 0.1076 477/500 [===========================>..] - ETA: 7s - loss: 0.9147 - regression_loss: 0.8072 - classification_loss: 0.1075 478/500 [===========================>..] - ETA: 7s - loss: 0.9180 - regression_loss: 0.8102 - classification_loss: 0.1078 479/500 [===========================>..] - ETA: 7s - loss: 0.9174 - regression_loss: 0.8097 - classification_loss: 0.1077 480/500 [===========================>..] - ETA: 6s - loss: 0.9175 - regression_loss: 0.8099 - classification_loss: 0.1077 481/500 [===========================>..] - ETA: 6s - loss: 0.9174 - regression_loss: 0.8098 - classification_loss: 0.1076 482/500 [===========================>..] - ETA: 6s - loss: 0.9175 - regression_loss: 0.8100 - classification_loss: 0.1075 483/500 [===========================>..] - ETA: 5s - loss: 0.9176 - regression_loss: 0.8101 - classification_loss: 0.1075 484/500 [============================>.] - ETA: 5s - loss: 0.9174 - regression_loss: 0.8099 - classification_loss: 0.1075 485/500 [============================>.] - ETA: 5s - loss: 0.9178 - regression_loss: 0.8102 - classification_loss: 0.1075 486/500 [============================>.] - ETA: 4s - loss: 0.9169 - regression_loss: 0.8096 - classification_loss: 0.1074 487/500 [============================>.] - ETA: 4s - loss: 0.9168 - regression_loss: 0.8095 - classification_loss: 0.1073 488/500 [============================>.] - ETA: 4s - loss: 0.9164 - regression_loss: 0.8092 - classification_loss: 0.1072 489/500 [============================>.] 
500/500 [==============================] - 169s 339ms/step - loss: 0.9150 - regression_loss: 0.8078 - classification_loss: 0.1072
326 instances of class plum with average precision: 0.8388
mAP: 0.8388
Epoch 00030: saving model to ./training/snapshots/resnet101_pascal_30.h5
Epoch 31/150
- ETA: 1:43 - loss: 0.8497 - regression_loss: 0.7623 - classification_loss: 0.0875 197/500 [==========>...................] - ETA: 1:42 - loss: 0.8518 - regression_loss: 0.7645 - classification_loss: 0.0873 198/500 [==========>...................] - ETA: 1:42 - loss: 0.8517 - regression_loss: 0.7647 - classification_loss: 0.0870 199/500 [==========>...................] - ETA: 1:42 - loss: 0.8528 - regression_loss: 0.7655 - classification_loss: 0.0872 200/500 [===========>..................] - ETA: 1:41 - loss: 0.8510 - regression_loss: 0.7640 - classification_loss: 0.0870 201/500 [===========>..................] - ETA: 1:41 - loss: 0.8508 - regression_loss: 0.7638 - classification_loss: 0.0870 202/500 [===========>..................] - ETA: 1:41 - loss: 0.8510 - regression_loss: 0.7640 - classification_loss: 0.0870 203/500 [===========>..................] - ETA: 1:40 - loss: 0.8504 - regression_loss: 0.7636 - classification_loss: 0.0868 204/500 [===========>..................] - ETA: 1:40 - loss: 0.8518 - regression_loss: 0.7646 - classification_loss: 0.0872 205/500 [===========>..................] - ETA: 1:40 - loss: 0.8517 - regression_loss: 0.7646 - classification_loss: 0.0871 206/500 [===========>..................] - ETA: 1:39 - loss: 0.8518 - regression_loss: 0.7648 - classification_loss: 0.0870 207/500 [===========>..................] - ETA: 1:39 - loss: 0.8492 - regression_loss: 0.7627 - classification_loss: 0.0866 208/500 [===========>..................] - ETA: 1:39 - loss: 0.8536 - regression_loss: 0.7666 - classification_loss: 0.0870 209/500 [===========>..................] - ETA: 1:38 - loss: 0.8522 - regression_loss: 0.7654 - classification_loss: 0.0869 210/500 [===========>..................] - ETA: 1:38 - loss: 0.8502 - regression_loss: 0.7633 - classification_loss: 0.0868 211/500 [===========>..................] - ETA: 1:38 - loss: 0.8519 - regression_loss: 0.7648 - classification_loss: 0.0871 212/500 [===========>..................] 
- ETA: 1:37 - loss: 0.8517 - regression_loss: 0.7646 - classification_loss: 0.0871 213/500 [===========>..................] - ETA: 1:37 - loss: 0.8524 - regression_loss: 0.7651 - classification_loss: 0.0874 214/500 [===========>..................] - ETA: 1:36 - loss: 0.8525 - regression_loss: 0.7652 - classification_loss: 0.0872 215/500 [===========>..................] - ETA: 1:36 - loss: 0.8529 - regression_loss: 0.7655 - classification_loss: 0.0874 216/500 [===========>..................] - ETA: 1:36 - loss: 0.8579 - regression_loss: 0.7695 - classification_loss: 0.0884 217/500 [============>.................] - ETA: 1:35 - loss: 0.8590 - regression_loss: 0.7700 - classification_loss: 0.0890 218/500 [============>.................] - ETA: 1:35 - loss: 0.8583 - regression_loss: 0.7695 - classification_loss: 0.0888 219/500 [============>.................] - ETA: 1:35 - loss: 0.8584 - regression_loss: 0.7698 - classification_loss: 0.0886 220/500 [============>.................] - ETA: 1:34 - loss: 0.8578 - regression_loss: 0.7690 - classification_loss: 0.0888 221/500 [============>.................] - ETA: 1:34 - loss: 0.8583 - regression_loss: 0.7695 - classification_loss: 0.0887 222/500 [============>.................] - ETA: 1:34 - loss: 0.8600 - regression_loss: 0.7711 - classification_loss: 0.0890 223/500 [============>.................] - ETA: 1:33 - loss: 0.8614 - regression_loss: 0.7727 - classification_loss: 0.0887 224/500 [============>.................] - ETA: 1:33 - loss: 0.8612 - regression_loss: 0.7726 - classification_loss: 0.0886 225/500 [============>.................] - ETA: 1:33 - loss: 0.8610 - regression_loss: 0.7726 - classification_loss: 0.0884 226/500 [============>.................] - ETA: 1:32 - loss: 0.8625 - regression_loss: 0.7736 - classification_loss: 0.0889 227/500 [============>.................] - ETA: 1:32 - loss: 0.8611 - regression_loss: 0.7725 - classification_loss: 0.0886 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.8604 - regression_loss: 0.7719 - classification_loss: 0.0885 229/500 [============>.................] - ETA: 1:31 - loss: 0.8595 - regression_loss: 0.7713 - classification_loss: 0.0882 230/500 [============>.................] - ETA: 1:31 - loss: 0.8612 - regression_loss: 0.7728 - classification_loss: 0.0884 231/500 [============>.................] - ETA: 1:31 - loss: 0.8606 - regression_loss: 0.7724 - classification_loss: 0.0883 232/500 [============>.................] - ETA: 1:30 - loss: 0.8602 - regression_loss: 0.7719 - classification_loss: 0.0883 233/500 [============>.................] - ETA: 1:30 - loss: 0.8587 - regression_loss: 0.7707 - classification_loss: 0.0881 234/500 [=============>................] - ETA: 1:30 - loss: 0.8580 - regression_loss: 0.7699 - classification_loss: 0.0880 235/500 [=============>................] - ETA: 1:29 - loss: 0.8583 - regression_loss: 0.7704 - classification_loss: 0.0879 236/500 [=============>................] - ETA: 1:29 - loss: 0.8568 - regression_loss: 0.7690 - classification_loss: 0.0878 237/500 [=============>................] - ETA: 1:29 - loss: 0.8579 - regression_loss: 0.7701 - classification_loss: 0.0878 238/500 [=============>................] - ETA: 1:28 - loss: 0.8559 - regression_loss: 0.7684 - classification_loss: 0.0875 239/500 [=============>................] - ETA: 1:28 - loss: 0.8617 - regression_loss: 0.7730 - classification_loss: 0.0887 240/500 [=============>................] - ETA: 1:28 - loss: 0.8600 - regression_loss: 0.7716 - classification_loss: 0.0884 241/500 [=============>................] - ETA: 1:27 - loss: 0.8574 - regression_loss: 0.7684 - classification_loss: 0.0890 242/500 [=============>................] - ETA: 1:27 - loss: 0.8573 - regression_loss: 0.7684 - classification_loss: 0.0889 243/500 [=============>................] - ETA: 1:27 - loss: 0.8582 - regression_loss: 0.7692 - classification_loss: 0.0890 244/500 [=============>................] 
- ETA: 1:26 - loss: 0.8568 - regression_loss: 0.7681 - classification_loss: 0.0888 245/500 [=============>................] - ETA: 1:26 - loss: 0.8565 - regression_loss: 0.7678 - classification_loss: 0.0887 246/500 [=============>................] - ETA: 1:26 - loss: 0.8574 - regression_loss: 0.7685 - classification_loss: 0.0889 247/500 [=============>................] - ETA: 1:25 - loss: 0.8590 - regression_loss: 0.7703 - classification_loss: 0.0887 248/500 [=============>................] - ETA: 1:25 - loss: 0.8590 - regression_loss: 0.7703 - classification_loss: 0.0887 249/500 [=============>................] - ETA: 1:25 - loss: 0.8581 - regression_loss: 0.7695 - classification_loss: 0.0886 250/500 [==============>...............] - ETA: 1:24 - loss: 0.8563 - regression_loss: 0.7680 - classification_loss: 0.0883 251/500 [==============>...............] - ETA: 1:24 - loss: 0.8558 - regression_loss: 0.7673 - classification_loss: 0.0884 252/500 [==============>...............] - ETA: 1:24 - loss: 0.8570 - regression_loss: 0.7686 - classification_loss: 0.0884 253/500 [==============>...............] - ETA: 1:23 - loss: 0.8571 - regression_loss: 0.7688 - classification_loss: 0.0882 254/500 [==============>...............] - ETA: 1:23 - loss: 0.8554 - regression_loss: 0.7674 - classification_loss: 0.0881 255/500 [==============>...............] - ETA: 1:23 - loss: 0.8540 - regression_loss: 0.7661 - classification_loss: 0.0879 256/500 [==============>...............] - ETA: 1:22 - loss: 0.8552 - regression_loss: 0.7672 - classification_loss: 0.0880 257/500 [==============>...............] - ETA: 1:22 - loss: 0.8561 - regression_loss: 0.7680 - classification_loss: 0.0881 258/500 [==============>...............] - ETA: 1:22 - loss: 0.8567 - regression_loss: 0.7686 - classification_loss: 0.0881 259/500 [==============>...............] - ETA: 1:21 - loss: 0.8605 - regression_loss: 0.7712 - classification_loss: 0.0893 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.8603 - regression_loss: 0.7710 - classification_loss: 0.0892 261/500 [==============>...............] - ETA: 1:21 - loss: 0.8608 - regression_loss: 0.7715 - classification_loss: 0.0893 262/500 [==============>...............] - ETA: 1:20 - loss: 0.8618 - regression_loss: 0.7724 - classification_loss: 0.0893 263/500 [==============>...............] - ETA: 1:20 - loss: 0.8603 - regression_loss: 0.7712 - classification_loss: 0.0892 264/500 [==============>...............] - ETA: 1:20 - loss: 0.8581 - regression_loss: 0.7692 - classification_loss: 0.0889 265/500 [==============>...............] - ETA: 1:19 - loss: 0.8588 - regression_loss: 0.7700 - classification_loss: 0.0888 266/500 [==============>...............] - ETA: 1:19 - loss: 0.8580 - regression_loss: 0.7693 - classification_loss: 0.0887 267/500 [===============>..............] - ETA: 1:19 - loss: 0.8589 - regression_loss: 0.7701 - classification_loss: 0.0888 268/500 [===============>..............] - ETA: 1:18 - loss: 0.8604 - regression_loss: 0.7713 - classification_loss: 0.0892 269/500 [===============>..............] - ETA: 1:18 - loss: 0.8611 - regression_loss: 0.7718 - classification_loss: 0.0893 270/500 [===============>..............] - ETA: 1:17 - loss: 0.8628 - regression_loss: 0.7729 - classification_loss: 0.0899 271/500 [===============>..............] - ETA: 1:17 - loss: 0.8639 - regression_loss: 0.7739 - classification_loss: 0.0900 272/500 [===============>..............] - ETA: 1:17 - loss: 0.8633 - regression_loss: 0.7734 - classification_loss: 0.0899 273/500 [===============>..............] - ETA: 1:16 - loss: 0.8635 - regression_loss: 0.7735 - classification_loss: 0.0900 274/500 [===============>..............] - ETA: 1:16 - loss: 0.8680 - regression_loss: 0.7772 - classification_loss: 0.0908 275/500 [===============>..............] - ETA: 1:16 - loss: 0.8683 - regression_loss: 0.7774 - classification_loss: 0.0909 276/500 [===============>..............] 
- ETA: 1:15 - loss: 0.8702 - regression_loss: 0.7792 - classification_loss: 0.0910 277/500 [===============>..............] - ETA: 1:15 - loss: 0.8724 - regression_loss: 0.7813 - classification_loss: 0.0911 278/500 [===============>..............] - ETA: 1:15 - loss: 0.8745 - regression_loss: 0.7830 - classification_loss: 0.0915 279/500 [===============>..............] - ETA: 1:14 - loss: 0.8745 - regression_loss: 0.7831 - classification_loss: 0.0914 280/500 [===============>..............] - ETA: 1:14 - loss: 0.8775 - regression_loss: 0.7855 - classification_loss: 0.0920 281/500 [===============>..............] - ETA: 1:14 - loss: 0.8775 - regression_loss: 0.7857 - classification_loss: 0.0918 282/500 [===============>..............] - ETA: 1:13 - loss: 0.8781 - regression_loss: 0.7863 - classification_loss: 0.0918 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8795 - regression_loss: 0.7875 - classification_loss: 0.0920 284/500 [================>.............] - ETA: 1:13 - loss: 0.8797 - regression_loss: 0.7876 - classification_loss: 0.0921 285/500 [================>.............] - ETA: 1:12 - loss: 0.8815 - regression_loss: 0.7892 - classification_loss: 0.0923 286/500 [================>.............] - ETA: 1:12 - loss: 0.8832 - regression_loss: 0.7906 - classification_loss: 0.0926 287/500 [================>.............] - ETA: 1:12 - loss: 0.8838 - regression_loss: 0.7913 - classification_loss: 0.0925 288/500 [================>.............] - ETA: 1:11 - loss: 0.8843 - regression_loss: 0.7917 - classification_loss: 0.0925 289/500 [================>.............] - ETA: 1:11 - loss: 0.8857 - regression_loss: 0.7929 - classification_loss: 0.0928 290/500 [================>.............] - ETA: 1:11 - loss: 0.8883 - regression_loss: 0.7947 - classification_loss: 0.0936 291/500 [================>.............] - ETA: 1:10 - loss: 0.8889 - regression_loss: 0.7953 - classification_loss: 0.0936 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.8891 - regression_loss: 0.7955 - classification_loss: 0.0935 293/500 [================>.............] - ETA: 1:10 - loss: 0.8898 - regression_loss: 0.7964 - classification_loss: 0.0934 294/500 [================>.............] - ETA: 1:09 - loss: 0.8914 - regression_loss: 0.7979 - classification_loss: 0.0935 295/500 [================>.............] - ETA: 1:09 - loss: 0.8936 - regression_loss: 0.8000 - classification_loss: 0.0936 296/500 [================>.............] - ETA: 1:09 - loss: 0.8916 - regression_loss: 0.7982 - classification_loss: 0.0934 297/500 [================>.............] - ETA: 1:08 - loss: 0.8911 - regression_loss: 0.7979 - classification_loss: 0.0933 298/500 [================>.............] - ETA: 1:08 - loss: 0.8908 - regression_loss: 0.7976 - classification_loss: 0.0932 299/500 [================>.............] - ETA: 1:08 - loss: 0.8926 - regression_loss: 0.7988 - classification_loss: 0.0938 300/500 [=================>............] - ETA: 1:07 - loss: 0.8929 - regression_loss: 0.7992 - classification_loss: 0.0937 301/500 [=================>............] - ETA: 1:07 - loss: 0.8917 - regression_loss: 0.7981 - classification_loss: 0.0936 302/500 [=================>............] - ETA: 1:07 - loss: 0.8914 - regression_loss: 0.7978 - classification_loss: 0.0936 303/500 [=================>............] - ETA: 1:06 - loss: 0.8904 - regression_loss: 0.7967 - classification_loss: 0.0936 304/500 [=================>............] - ETA: 1:06 - loss: 0.8890 - regression_loss: 0.7956 - classification_loss: 0.0934 305/500 [=================>............] - ETA: 1:06 - loss: 0.8906 - regression_loss: 0.7968 - classification_loss: 0.0938 306/500 [=================>............] - ETA: 1:05 - loss: 0.8919 - regression_loss: 0.7979 - classification_loss: 0.0940 307/500 [=================>............] - ETA: 1:05 - loss: 0.8914 - regression_loss: 0.7975 - classification_loss: 0.0938 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.8908 - regression_loss: 0.7971 - classification_loss: 0.0937 309/500 [=================>............] - ETA: 1:04 - loss: 0.8919 - regression_loss: 0.7983 - classification_loss: 0.0937 310/500 [=================>............] - ETA: 1:04 - loss: 0.8910 - regression_loss: 0.7975 - classification_loss: 0.0935 311/500 [=================>............] - ETA: 1:04 - loss: 0.8900 - regression_loss: 0.7967 - classification_loss: 0.0933 312/500 [=================>............] - ETA: 1:03 - loss: 0.8897 - regression_loss: 0.7964 - classification_loss: 0.0933 313/500 [=================>............] - ETA: 1:03 - loss: 0.8889 - regression_loss: 0.7957 - classification_loss: 0.0931 314/500 [=================>............] - ETA: 1:02 - loss: 0.8906 - regression_loss: 0.7970 - classification_loss: 0.0936 315/500 [=================>............] - ETA: 1:02 - loss: 0.8915 - regression_loss: 0.7977 - classification_loss: 0.0938 316/500 [=================>............] - ETA: 1:02 - loss: 0.8913 - regression_loss: 0.7977 - classification_loss: 0.0936 317/500 [==================>...........] - ETA: 1:01 - loss: 0.8907 - regression_loss: 0.7972 - classification_loss: 0.0935 318/500 [==================>...........] - ETA: 1:01 - loss: 0.8932 - regression_loss: 0.7991 - classification_loss: 0.0941 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8933 - regression_loss: 0.7993 - classification_loss: 0.0940 320/500 [==================>...........] - ETA: 1:00 - loss: 0.8935 - regression_loss: 0.7995 - classification_loss: 0.0939 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8917 - regression_loss: 0.7980 - classification_loss: 0.0937 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8920 - regression_loss: 0.7983 - classification_loss: 0.0937 323/500 [==================>...........] - ETA: 59s - loss: 0.8915 - regression_loss: 0.7980 - classification_loss: 0.0935  324/500 [==================>...........] 
- ETA: 59s - loss: 0.8903 - regression_loss: 0.7970 - classification_loss: 0.0933 325/500 [==================>...........] - ETA: 59s - loss: 0.8895 - regression_loss: 0.7961 - classification_loss: 0.0933 326/500 [==================>...........] - ETA: 58s - loss: 0.8902 - regression_loss: 0.7968 - classification_loss: 0.0934 327/500 [==================>...........] - ETA: 58s - loss: 0.8901 - regression_loss: 0.7967 - classification_loss: 0.0934 328/500 [==================>...........] - ETA: 58s - loss: 0.8874 - regression_loss: 0.7942 - classification_loss: 0.0932 329/500 [==================>...........] - ETA: 57s - loss: 0.8874 - regression_loss: 0.7943 - classification_loss: 0.0931 330/500 [==================>...........] - ETA: 57s - loss: 0.8881 - regression_loss: 0.7950 - classification_loss: 0.0932 331/500 [==================>...........] - ETA: 57s - loss: 0.8906 - regression_loss: 0.7968 - classification_loss: 0.0938 332/500 [==================>...........] - ETA: 56s - loss: 0.8899 - regression_loss: 0.7963 - classification_loss: 0.0937 333/500 [==================>...........] - ETA: 56s - loss: 0.8892 - regression_loss: 0.7957 - classification_loss: 0.0935 334/500 [===================>..........] - ETA: 56s - loss: 0.8894 - regression_loss: 0.7960 - classification_loss: 0.0934 335/500 [===================>..........] - ETA: 55s - loss: 0.8895 - regression_loss: 0.7961 - classification_loss: 0.0933 336/500 [===================>..........] - ETA: 55s - loss: 0.8894 - regression_loss: 0.7960 - classification_loss: 0.0934 337/500 [===================>..........] - ETA: 55s - loss: 0.8895 - regression_loss: 0.7961 - classification_loss: 0.0935 338/500 [===================>..........] - ETA: 54s - loss: 0.8913 - regression_loss: 0.7972 - classification_loss: 0.0941 339/500 [===================>..........] - ETA: 54s - loss: 0.8905 - regression_loss: 0.7966 - classification_loss: 0.0940 340/500 [===================>..........] 
- ETA: 54s - loss: 0.8912 - regression_loss: 0.7970 - classification_loss: 0.0942 341/500 [===================>..........] - ETA: 53s - loss: 0.8930 - regression_loss: 0.7987 - classification_loss: 0.0943 342/500 [===================>..........] - ETA: 53s - loss: 0.8914 - regression_loss: 0.7973 - classification_loss: 0.0942 343/500 [===================>..........] - ETA: 53s - loss: 0.8912 - regression_loss: 0.7971 - classification_loss: 0.0941 344/500 [===================>..........] - ETA: 52s - loss: 0.8916 - regression_loss: 0.7975 - classification_loss: 0.0941 345/500 [===================>..........] - ETA: 52s - loss: 0.8906 - regression_loss: 0.7967 - classification_loss: 0.0939 346/500 [===================>..........] - ETA: 52s - loss: 0.8900 - regression_loss: 0.7962 - classification_loss: 0.0938 347/500 [===================>..........] - ETA: 51s - loss: 0.8908 - regression_loss: 0.7970 - classification_loss: 0.0938 348/500 [===================>..........] - ETA: 51s - loss: 0.8913 - regression_loss: 0.7975 - classification_loss: 0.0939 349/500 [===================>..........] - ETA: 51s - loss: 0.8915 - regression_loss: 0.7977 - classification_loss: 0.0938 350/500 [====================>.........] - ETA: 50s - loss: 0.8908 - regression_loss: 0.7972 - classification_loss: 0.0936 351/500 [====================>.........] - ETA: 50s - loss: 0.8919 - regression_loss: 0.7981 - classification_loss: 0.0938 352/500 [====================>.........] - ETA: 50s - loss: 0.8923 - regression_loss: 0.7984 - classification_loss: 0.0939 353/500 [====================>.........] - ETA: 49s - loss: 0.8931 - regression_loss: 0.7991 - classification_loss: 0.0941 354/500 [====================>.........] - ETA: 49s - loss: 0.8929 - regression_loss: 0.7990 - classification_loss: 0.0939 355/500 [====================>.........] - ETA: 49s - loss: 0.8953 - regression_loss: 0.8010 - classification_loss: 0.0943 356/500 [====================>.........] 
- ETA: 48s - loss: 0.8956 - regression_loss: 0.8014 - classification_loss: 0.0943 357/500 [====================>.........] - ETA: 48s - loss: 0.8957 - regression_loss: 0.8016 - classification_loss: 0.0942 358/500 [====================>.........] - ETA: 48s - loss: 0.8966 - regression_loss: 0.8023 - classification_loss: 0.0943 359/500 [====================>.........] - ETA: 47s - loss: 0.8970 - regression_loss: 0.8028 - classification_loss: 0.0943 360/500 [====================>.........] - ETA: 47s - loss: 0.8974 - regression_loss: 0.8032 - classification_loss: 0.0942 361/500 [====================>.........] - ETA: 47s - loss: 0.9001 - regression_loss: 0.8058 - classification_loss: 0.0943 362/500 [====================>.........] - ETA: 46s - loss: 0.9013 - regression_loss: 0.8070 - classification_loss: 0.0943 363/500 [====================>.........] - ETA: 46s - loss: 0.9016 - regression_loss: 0.8071 - classification_loss: 0.0945 364/500 [====================>.........] - ETA: 46s - loss: 0.9032 - regression_loss: 0.8081 - classification_loss: 0.0950 365/500 [====================>.........] - ETA: 45s - loss: 0.9059 - regression_loss: 0.8109 - classification_loss: 0.0950 366/500 [====================>.........] - ETA: 45s - loss: 0.9066 - regression_loss: 0.8114 - classification_loss: 0.0952 367/500 [=====================>........] - ETA: 45s - loss: 0.9054 - regression_loss: 0.8104 - classification_loss: 0.0950 368/500 [=====================>........] - ETA: 44s - loss: 0.9051 - regression_loss: 0.8102 - classification_loss: 0.0949 369/500 [=====================>........] - ETA: 44s - loss: 0.9039 - regression_loss: 0.8092 - classification_loss: 0.0947 370/500 [=====================>........] - ETA: 44s - loss: 0.9044 - regression_loss: 0.8097 - classification_loss: 0.0947 371/500 [=====================>........] - ETA: 43s - loss: 0.9029 - regression_loss: 0.8083 - classification_loss: 0.0946 372/500 [=====================>........] 
- ETA: 43s - loss: 0.9021 - regression_loss: 0.8076 - classification_loss: 0.0945 373/500 [=====================>........] - ETA: 43s - loss: 0.9007 - regression_loss: 0.8064 - classification_loss: 0.0943 374/500 [=====================>........] - ETA: 42s - loss: 0.9008 - regression_loss: 0.8066 - classification_loss: 0.0942 375/500 [=====================>........] - ETA: 42s - loss: 0.9011 - regression_loss: 0.8070 - classification_loss: 0.0941 376/500 [=====================>........] - ETA: 41s - loss: 0.9001 - regression_loss: 0.8061 - classification_loss: 0.0940 377/500 [=====================>........] - ETA: 41s - loss: 0.8992 - regression_loss: 0.8053 - classification_loss: 0.0939 378/500 [=====================>........] - ETA: 41s - loss: 0.8999 - regression_loss: 0.8060 - classification_loss: 0.0940 379/500 [=====================>........] - ETA: 40s - loss: 0.9001 - regression_loss: 0.8061 - classification_loss: 0.0940 380/500 [=====================>........] - ETA: 40s - loss: 0.8998 - regression_loss: 0.8060 - classification_loss: 0.0938 381/500 [=====================>........] - ETA: 40s - loss: 0.8988 - regression_loss: 0.8051 - classification_loss: 0.0937 382/500 [=====================>........] - ETA: 39s - loss: 0.8990 - regression_loss: 0.8053 - classification_loss: 0.0937 383/500 [=====================>........] - ETA: 39s - loss: 0.8988 - regression_loss: 0.8053 - classification_loss: 0.0936 384/500 [======================>.......] - ETA: 39s - loss: 0.8983 - regression_loss: 0.8047 - classification_loss: 0.0936 385/500 [======================>.......] - ETA: 38s - loss: 0.8976 - regression_loss: 0.8041 - classification_loss: 0.0935 386/500 [======================>.......] - ETA: 38s - loss: 0.8991 - regression_loss: 0.8051 - classification_loss: 0.0940 387/500 [======================>.......] - ETA: 38s - loss: 0.9007 - regression_loss: 0.8065 - classification_loss: 0.0942 388/500 [======================>.......] 
- ETA: 37s - loss: 0.9004 - regression_loss: 0.8062 - classification_loss: 0.0941 389/500 [======================>.......] - ETA: 37s - loss: 0.9027 - regression_loss: 0.8084 - classification_loss: 0.0943 390/500 [======================>.......] - ETA: 37s - loss: 0.9016 - regression_loss: 0.8073 - classification_loss: 0.0943 391/500 [======================>.......] - ETA: 36s - loss: 0.9022 - regression_loss: 0.8079 - classification_loss: 0.0943 392/500 [======================>.......] - ETA: 36s - loss: 0.9017 - regression_loss: 0.8074 - classification_loss: 0.0943 393/500 [======================>.......] - ETA: 36s - loss: 0.9003 - regression_loss: 0.8062 - classification_loss: 0.0941 394/500 [======================>.......] - ETA: 35s - loss: 0.9001 - regression_loss: 0.8061 - classification_loss: 0.0940 395/500 [======================>.......] - ETA: 35s - loss: 0.9002 - regression_loss: 0.8061 - classification_loss: 0.0940 396/500 [======================>.......] - ETA: 35s - loss: 0.9009 - regression_loss: 0.8068 - classification_loss: 0.0941 397/500 [======================>.......] - ETA: 34s - loss: 0.9008 - regression_loss: 0.8067 - classification_loss: 0.0941 398/500 [======================>.......] - ETA: 34s - loss: 0.9002 - regression_loss: 0.8062 - classification_loss: 0.0940 399/500 [======================>.......] - ETA: 34s - loss: 0.9006 - regression_loss: 0.8066 - classification_loss: 0.0940 400/500 [=======================>......] - ETA: 33s - loss: 0.9000 - regression_loss: 0.8061 - classification_loss: 0.0939 401/500 [=======================>......] - ETA: 33s - loss: 0.8994 - regression_loss: 0.8057 - classification_loss: 0.0938 402/500 [=======================>......] - ETA: 33s - loss: 0.8986 - regression_loss: 0.8049 - classification_loss: 0.0937 403/500 [=======================>......] - ETA: 32s - loss: 0.8976 - regression_loss: 0.8040 - classification_loss: 0.0936 404/500 [=======================>......] 
500/500 [==============================] - 169s 339ms/step - loss: 0.8998 - regression_loss: 0.8073 - classification_loss: 0.0925
326 instances of class plum with average precision: 0.8156
mAP: 0.8156
Epoch 00031: saving model to ./training/snapshots/resnet101_pascal_31.h5
Epoch 32/150
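The `loss` reported on each line above is the sum of the two printed components: `regression_loss` (a smooth-L1 penalty on box offsets) and `classification_loss` (focal loss on anchor labels). The sketch below is a minimal NumPy illustration of that decomposition, assuming the standard RetinaNet defaults (sigma=3.0, alpha=0.25, gamma=2.0); keras-retinanet's actual implementation lives in its own losses module, so treat this as an approximation, not the project's code.

```python
import numpy as np

def smooth_l1(pred, target, sigma=3.0):
    # Box-regression loss: quadratic for small residuals, linear for
    # large ones, so outlier boxes do not dominate the gradient.
    sigma2 = sigma ** 2
    diff = np.abs(pred - target)
    return np.where(diff < 1.0 / sigma2,
                    0.5 * sigma2 * diff ** 2,
                    diff - 0.5 / sigma2)

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    # Classification loss: down-weights easy (confidently correct)
    # anchors so the many easy background anchors do not swamp training.
    p = np.clip(p, 1e-7, 1.0 - 1e-7)
    pt = np.where(y == 1, p, 1.0 - p)          # prob of the true class
    a = np.where(y == 1, alpha, 1.0 - alpha)   # class-balance weight
    return -a * (1.0 - pt) ** gamma * np.log(pt)

# The logged "loss" mirrors: loss = regression_loss + classification_loss
reg = smooth_l1(np.array([0.1, 2.0]), np.array([0.0, 0.0])).mean()
cls = focal_loss(np.array([0.9, 0.2]), np.array([1, 0])).mean()
total = reg + cls
```

Both components are averaged over anchors, which is why `classification_loss` stays small once most background anchors are classified confidently.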
- ETA: 1:28 - loss: 0.8771 - regression_loss: 0.7851 - classification_loss: 0.0920 239/500 [=============>................] - ETA: 1:28 - loss: 0.8762 - regression_loss: 0.7843 - classification_loss: 0.0918 240/500 [=============>................] - ETA: 1:28 - loss: 0.8762 - regression_loss: 0.7844 - classification_loss: 0.0917 241/500 [=============>................] - ETA: 1:27 - loss: 0.8733 - regression_loss: 0.7819 - classification_loss: 0.0914 242/500 [=============>................] - ETA: 1:27 - loss: 0.8724 - regression_loss: 0.7812 - classification_loss: 0.0912 243/500 [=============>................] - ETA: 1:27 - loss: 0.8718 - regression_loss: 0.7808 - classification_loss: 0.0910 244/500 [=============>................] - ETA: 1:26 - loss: 0.8724 - regression_loss: 0.7811 - classification_loss: 0.0913 245/500 [=============>................] - ETA: 1:26 - loss: 0.8718 - regression_loss: 0.7805 - classification_loss: 0.0912 246/500 [=============>................] - ETA: 1:26 - loss: 0.8745 - regression_loss: 0.7820 - classification_loss: 0.0925 247/500 [=============>................] - ETA: 1:25 - loss: 0.8724 - regression_loss: 0.7802 - classification_loss: 0.0922 248/500 [=============>................] - ETA: 1:25 - loss: 0.8714 - regression_loss: 0.7794 - classification_loss: 0.0920 249/500 [=============>................] - ETA: 1:25 - loss: 0.8696 - regression_loss: 0.7778 - classification_loss: 0.0918 250/500 [==============>...............] - ETA: 1:24 - loss: 0.8675 - regression_loss: 0.7758 - classification_loss: 0.0917 251/500 [==============>...............] - ETA: 1:24 - loss: 0.8666 - regression_loss: 0.7752 - classification_loss: 0.0914 252/500 [==============>...............] - ETA: 1:24 - loss: 0.8671 - regression_loss: 0.7754 - classification_loss: 0.0917 253/500 [==============>...............] - ETA: 1:23 - loss: 0.8658 - regression_loss: 0.7743 - classification_loss: 0.0915 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.8664 - regression_loss: 0.7751 - classification_loss: 0.0914 255/500 [==============>...............] - ETA: 1:23 - loss: 0.8666 - regression_loss: 0.7753 - classification_loss: 0.0912 256/500 [==============>...............] - ETA: 1:22 - loss: 0.8670 - regression_loss: 0.7756 - classification_loss: 0.0914 257/500 [==============>...............] - ETA: 1:22 - loss: 0.8665 - regression_loss: 0.7751 - classification_loss: 0.0914 258/500 [==============>...............] - ETA: 1:22 - loss: 0.8677 - regression_loss: 0.7762 - classification_loss: 0.0916 259/500 [==============>...............] - ETA: 1:21 - loss: 0.8673 - regression_loss: 0.7758 - classification_loss: 0.0914 260/500 [==============>...............] - ETA: 1:21 - loss: 0.8675 - regression_loss: 0.7761 - classification_loss: 0.0914 261/500 [==============>...............] - ETA: 1:21 - loss: 0.8684 - regression_loss: 0.7770 - classification_loss: 0.0915 262/500 [==============>...............] - ETA: 1:20 - loss: 0.8702 - regression_loss: 0.7788 - classification_loss: 0.0914 263/500 [==============>...............] - ETA: 1:20 - loss: 0.8707 - regression_loss: 0.7793 - classification_loss: 0.0914 264/500 [==============>...............] - ETA: 1:20 - loss: 0.8714 - regression_loss: 0.7800 - classification_loss: 0.0913 265/500 [==============>...............] - ETA: 1:19 - loss: 0.8711 - regression_loss: 0.7799 - classification_loss: 0.0913 266/500 [==============>...............] - ETA: 1:19 - loss: 0.8725 - regression_loss: 0.7809 - classification_loss: 0.0916 267/500 [===============>..............] - ETA: 1:19 - loss: 0.8752 - regression_loss: 0.7831 - classification_loss: 0.0922 268/500 [===============>..............] - ETA: 1:18 - loss: 0.8749 - regression_loss: 0.7828 - classification_loss: 0.0921 269/500 [===============>..............] - ETA: 1:18 - loss: 0.8761 - regression_loss: 0.7840 - classification_loss: 0.0921 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.8752 - regression_loss: 0.7833 - classification_loss: 0.0919 271/500 [===============>..............] - ETA: 1:17 - loss: 0.8760 - regression_loss: 0.7841 - classification_loss: 0.0919 272/500 [===============>..............] - ETA: 1:17 - loss: 0.8738 - regression_loss: 0.7823 - classification_loss: 0.0916 273/500 [===============>..............] - ETA: 1:17 - loss: 0.8744 - regression_loss: 0.7829 - classification_loss: 0.0916 274/500 [===============>..............] - ETA: 1:16 - loss: 0.8751 - regression_loss: 0.7836 - classification_loss: 0.0915 275/500 [===============>..............] - ETA: 1:16 - loss: 0.8749 - regression_loss: 0.7835 - classification_loss: 0.0914 276/500 [===============>..............] - ETA: 1:16 - loss: 0.8748 - regression_loss: 0.7834 - classification_loss: 0.0914 277/500 [===============>..............] - ETA: 1:15 - loss: 0.8751 - regression_loss: 0.7836 - classification_loss: 0.0914 278/500 [===============>..............] - ETA: 1:15 - loss: 0.8749 - regression_loss: 0.7836 - classification_loss: 0.0912 279/500 [===============>..............] - ETA: 1:14 - loss: 0.8749 - regression_loss: 0.7837 - classification_loss: 0.0912 280/500 [===============>..............] - ETA: 1:14 - loss: 0.8747 - regression_loss: 0.7836 - classification_loss: 0.0911 281/500 [===============>..............] - ETA: 1:14 - loss: 0.8749 - regression_loss: 0.7838 - classification_loss: 0.0911 282/500 [===============>..............] - ETA: 1:13 - loss: 0.8751 - regression_loss: 0.7841 - classification_loss: 0.0910 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8768 - regression_loss: 0.7857 - classification_loss: 0.0911 284/500 [================>.............] - ETA: 1:13 - loss: 0.8773 - regression_loss: 0.7863 - classification_loss: 0.0910 285/500 [================>.............] - ETA: 1:12 - loss: 0.8774 - regression_loss: 0.7864 - classification_loss: 0.0910 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.8758 - regression_loss: 0.7850 - classification_loss: 0.0907 287/500 [================>.............] - ETA: 1:12 - loss: 0.8757 - regression_loss: 0.7850 - classification_loss: 0.0907 288/500 [================>.............] - ETA: 1:11 - loss: 0.8752 - regression_loss: 0.7845 - classification_loss: 0.0907 289/500 [================>.............] - ETA: 1:11 - loss: 0.8739 - regression_loss: 0.7834 - classification_loss: 0.0904 290/500 [================>.............] - ETA: 1:11 - loss: 0.8721 - regression_loss: 0.7819 - classification_loss: 0.0902 291/500 [================>.............] - ETA: 1:10 - loss: 0.8726 - regression_loss: 0.7823 - classification_loss: 0.0903 292/500 [================>.............] - ETA: 1:10 - loss: 0.8728 - regression_loss: 0.7826 - classification_loss: 0.0902 293/500 [================>.............] - ETA: 1:10 - loss: 0.8756 - regression_loss: 0.7846 - classification_loss: 0.0910 294/500 [================>.............] - ETA: 1:09 - loss: 0.8766 - regression_loss: 0.7856 - classification_loss: 0.0910 295/500 [================>.............] - ETA: 1:09 - loss: 0.8759 - regression_loss: 0.7849 - classification_loss: 0.0910 296/500 [================>.............] - ETA: 1:09 - loss: 0.8757 - regression_loss: 0.7847 - classification_loss: 0.0910 297/500 [================>.............] - ETA: 1:08 - loss: 0.8753 - regression_loss: 0.7845 - classification_loss: 0.0908 298/500 [================>.............] - ETA: 1:08 - loss: 0.8769 - regression_loss: 0.7858 - classification_loss: 0.0911 299/500 [================>.............] - ETA: 1:08 - loss: 0.8761 - regression_loss: 0.7852 - classification_loss: 0.0909 300/500 [=================>............] - ETA: 1:07 - loss: 0.8768 - regression_loss: 0.7857 - classification_loss: 0.0911 301/500 [=================>............] - ETA: 1:07 - loss: 0.8761 - regression_loss: 0.7851 - classification_loss: 0.0910 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.8749 - regression_loss: 0.7841 - classification_loss: 0.0908 303/500 [=================>............] - ETA: 1:06 - loss: 0.8732 - regression_loss: 0.7826 - classification_loss: 0.0906 304/500 [=================>............] - ETA: 1:06 - loss: 0.8723 - regression_loss: 0.7819 - classification_loss: 0.0904 305/500 [=================>............] - ETA: 1:06 - loss: 0.8731 - regression_loss: 0.7825 - classification_loss: 0.0906 306/500 [=================>............] - ETA: 1:05 - loss: 0.8738 - regression_loss: 0.7830 - classification_loss: 0.0907 307/500 [=================>............] - ETA: 1:05 - loss: 0.8741 - regression_loss: 0.7834 - classification_loss: 0.0907 308/500 [=================>............] - ETA: 1:05 - loss: 0.8733 - regression_loss: 0.7828 - classification_loss: 0.0905 309/500 [=================>............] - ETA: 1:04 - loss: 0.8722 - regression_loss: 0.7818 - classification_loss: 0.0904 310/500 [=================>............] - ETA: 1:04 - loss: 0.8753 - regression_loss: 0.7842 - classification_loss: 0.0911 311/500 [=================>............] - ETA: 1:04 - loss: 0.8738 - regression_loss: 0.7829 - classification_loss: 0.0909 312/500 [=================>............] - ETA: 1:03 - loss: 0.8729 - regression_loss: 0.7822 - classification_loss: 0.0907 313/500 [=================>............] - ETA: 1:03 - loss: 0.8726 - regression_loss: 0.7819 - classification_loss: 0.0907 314/500 [=================>............] - ETA: 1:03 - loss: 0.8705 - regression_loss: 0.7800 - classification_loss: 0.0905 315/500 [=================>............] - ETA: 1:02 - loss: 0.8720 - regression_loss: 0.7815 - classification_loss: 0.0906 316/500 [=================>............] - ETA: 1:02 - loss: 0.8702 - regression_loss: 0.7799 - classification_loss: 0.0903 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8685 - regression_loss: 0.7784 - classification_loss: 0.0901 318/500 [==================>...........] 
- ETA: 1:01 - loss: 0.8709 - regression_loss: 0.7807 - classification_loss: 0.0902 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8709 - regression_loss: 0.7807 - classification_loss: 0.0902 320/500 [==================>...........] - ETA: 1:00 - loss: 0.8695 - regression_loss: 0.7795 - classification_loss: 0.0900 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8693 - regression_loss: 0.7794 - classification_loss: 0.0899 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8700 - regression_loss: 0.7801 - classification_loss: 0.0899 323/500 [==================>...........] - ETA: 59s - loss: 0.8726 - regression_loss: 0.7820 - classification_loss: 0.0906  324/500 [==================>...........] - ETA: 59s - loss: 0.8713 - regression_loss: 0.7808 - classification_loss: 0.0905 325/500 [==================>...........] - ETA: 59s - loss: 0.8716 - regression_loss: 0.7813 - classification_loss: 0.0903 326/500 [==================>...........] - ETA: 58s - loss: 0.8722 - regression_loss: 0.7817 - classification_loss: 0.0905 327/500 [==================>...........] - ETA: 58s - loss: 0.8725 - regression_loss: 0.7820 - classification_loss: 0.0905 328/500 [==================>...........] - ETA: 58s - loss: 0.8732 - regression_loss: 0.7828 - classification_loss: 0.0905 329/500 [==================>...........] - ETA: 57s - loss: 0.8734 - regression_loss: 0.7830 - classification_loss: 0.0904 330/500 [==================>...........] - ETA: 57s - loss: 0.8724 - regression_loss: 0.7820 - classification_loss: 0.0904 331/500 [==================>...........] - ETA: 57s - loss: 0.8722 - regression_loss: 0.7819 - classification_loss: 0.0903 332/500 [==================>...........] - ETA: 56s - loss: 0.8712 - regression_loss: 0.7811 - classification_loss: 0.0901 333/500 [==================>...........] - ETA: 56s - loss: 0.8715 - regression_loss: 0.7812 - classification_loss: 0.0902 334/500 [===================>..........] 
- ETA: 56s - loss: 0.8715 - regression_loss: 0.7813 - classification_loss: 0.0902 335/500 [===================>..........] - ETA: 55s - loss: 0.8717 - regression_loss: 0.7815 - classification_loss: 0.0902 336/500 [===================>..........] - ETA: 55s - loss: 0.8732 - regression_loss: 0.7831 - classification_loss: 0.0901 337/500 [===================>..........] - ETA: 55s - loss: 0.8719 - regression_loss: 0.7820 - classification_loss: 0.0899 338/500 [===================>..........] - ETA: 54s - loss: 0.8713 - regression_loss: 0.7816 - classification_loss: 0.0897 339/500 [===================>..........] - ETA: 54s - loss: 0.8722 - regression_loss: 0.7824 - classification_loss: 0.0898 340/500 [===================>..........] - ETA: 54s - loss: 0.8714 - regression_loss: 0.7817 - classification_loss: 0.0897 341/500 [===================>..........] - ETA: 53s - loss: 0.8719 - regression_loss: 0.7821 - classification_loss: 0.0898 342/500 [===================>..........] - ETA: 53s - loss: 0.8721 - regression_loss: 0.7823 - classification_loss: 0.0898 343/500 [===================>..........] - ETA: 53s - loss: 0.8741 - regression_loss: 0.7839 - classification_loss: 0.0902 344/500 [===================>..........] - ETA: 52s - loss: 0.8760 - regression_loss: 0.7851 - classification_loss: 0.0909 345/500 [===================>..........] - ETA: 52s - loss: 0.8754 - regression_loss: 0.7846 - classification_loss: 0.0908 346/500 [===================>..........] - ETA: 52s - loss: 0.8751 - regression_loss: 0.7844 - classification_loss: 0.0907 347/500 [===================>..........] - ETA: 51s - loss: 0.8753 - regression_loss: 0.7846 - classification_loss: 0.0907 348/500 [===================>..........] - ETA: 51s - loss: 0.8752 - regression_loss: 0.7843 - classification_loss: 0.0909 349/500 [===================>..........] - ETA: 51s - loss: 0.8738 - regression_loss: 0.7831 - classification_loss: 0.0907 350/500 [====================>.........] 
- ETA: 50s - loss: 0.8733 - regression_loss: 0.7828 - classification_loss: 0.0905 351/500 [====================>.........] - ETA: 50s - loss: 0.8746 - regression_loss: 0.7837 - classification_loss: 0.0909 352/500 [====================>.........] - ETA: 50s - loss: 0.8740 - regression_loss: 0.7833 - classification_loss: 0.0908 353/500 [====================>.........] - ETA: 49s - loss: 0.8743 - regression_loss: 0.7835 - classification_loss: 0.0908 354/500 [====================>.........] - ETA: 49s - loss: 0.8742 - regression_loss: 0.7834 - classification_loss: 0.0908 355/500 [====================>.........] - ETA: 49s - loss: 0.8745 - regression_loss: 0.7837 - classification_loss: 0.0908 356/500 [====================>.........] - ETA: 48s - loss: 0.8767 - regression_loss: 0.7854 - classification_loss: 0.0913 357/500 [====================>.........] - ETA: 48s - loss: 0.8762 - regression_loss: 0.7850 - classification_loss: 0.0912 358/500 [====================>.........] - ETA: 48s - loss: 0.8760 - regression_loss: 0.7849 - classification_loss: 0.0911 359/500 [====================>.........] - ETA: 47s - loss: 0.8772 - regression_loss: 0.7859 - classification_loss: 0.0913 360/500 [====================>.........] - ETA: 47s - loss: 0.8798 - regression_loss: 0.7882 - classification_loss: 0.0916 361/500 [====================>.........] - ETA: 47s - loss: 0.8805 - regression_loss: 0.7889 - classification_loss: 0.0916 362/500 [====================>.........] - ETA: 46s - loss: 0.8813 - regression_loss: 0.7898 - classification_loss: 0.0915 363/500 [====================>.........] - ETA: 46s - loss: 0.8845 - regression_loss: 0.7926 - classification_loss: 0.0919 364/500 [====================>.........] - ETA: 46s - loss: 0.8842 - regression_loss: 0.7924 - classification_loss: 0.0918 365/500 [====================>.........] - ETA: 45s - loss: 0.8847 - regression_loss: 0.7929 - classification_loss: 0.0918 366/500 [====================>.........] 
- ETA: 45s - loss: 0.8852 - regression_loss: 0.7934 - classification_loss: 0.0919 367/500 [=====================>........] - ETA: 45s - loss: 0.8840 - regression_loss: 0.7923 - classification_loss: 0.0917 368/500 [=====================>........] - ETA: 44s - loss: 0.8826 - regression_loss: 0.7909 - classification_loss: 0.0917 369/500 [=====================>........] - ETA: 44s - loss: 0.8828 - regression_loss: 0.7911 - classification_loss: 0.0918 370/500 [=====================>........] - ETA: 44s - loss: 0.8846 - regression_loss: 0.7928 - classification_loss: 0.0918 371/500 [=====================>........] - ETA: 43s - loss: 0.8837 - regression_loss: 0.7920 - classification_loss: 0.0917 372/500 [=====================>........] - ETA: 43s - loss: 0.8836 - regression_loss: 0.7921 - classification_loss: 0.0916 373/500 [=====================>........] - ETA: 42s - loss: 0.8838 - regression_loss: 0.7924 - classification_loss: 0.0915 374/500 [=====================>........] - ETA: 42s - loss: 0.8834 - regression_loss: 0.7921 - classification_loss: 0.0913 375/500 [=====================>........] - ETA: 42s - loss: 0.8850 - regression_loss: 0.7934 - classification_loss: 0.0915 376/500 [=====================>........] - ETA: 41s - loss: 0.8849 - regression_loss: 0.7934 - classification_loss: 0.0916 377/500 [=====================>........] - ETA: 41s - loss: 0.8859 - regression_loss: 0.7945 - classification_loss: 0.0914 378/500 [=====================>........] - ETA: 41s - loss: 0.8884 - regression_loss: 0.7962 - classification_loss: 0.0922 379/500 [=====================>........] - ETA: 40s - loss: 0.8893 - regression_loss: 0.7970 - classification_loss: 0.0923 380/500 [=====================>........] - ETA: 40s - loss: 0.8897 - regression_loss: 0.7974 - classification_loss: 0.0923 381/500 [=====================>........] - ETA: 40s - loss: 0.8905 - regression_loss: 0.7981 - classification_loss: 0.0924 382/500 [=====================>........] 
- ETA: 39s - loss: 0.8893 - regression_loss: 0.7971 - classification_loss: 0.0922 383/500 [=====================>........] - ETA: 39s - loss: 0.8906 - regression_loss: 0.7985 - classification_loss: 0.0922 384/500 [======================>.......] - ETA: 39s - loss: 0.8901 - regression_loss: 0.7981 - classification_loss: 0.0920 385/500 [======================>.......] - ETA: 38s - loss: 0.8906 - regression_loss: 0.7983 - classification_loss: 0.0923 386/500 [======================>.......] - ETA: 38s - loss: 0.8913 - regression_loss: 0.7989 - classification_loss: 0.0924 387/500 [======================>.......] - ETA: 38s - loss: 0.8907 - regression_loss: 0.7984 - classification_loss: 0.0923 388/500 [======================>.......] - ETA: 37s - loss: 0.8904 - regression_loss: 0.7982 - classification_loss: 0.0922 389/500 [======================>.......] - ETA: 37s - loss: 0.8898 - regression_loss: 0.7977 - classification_loss: 0.0921 390/500 [======================>.......] - ETA: 37s - loss: 0.8902 - regression_loss: 0.7980 - classification_loss: 0.0923 391/500 [======================>.......] - ETA: 36s - loss: 0.8897 - regression_loss: 0.7977 - classification_loss: 0.0921 392/500 [======================>.......] - ETA: 36s - loss: 0.8901 - regression_loss: 0.7981 - classification_loss: 0.0920 393/500 [======================>.......] - ETA: 36s - loss: 0.8907 - regression_loss: 0.7987 - classification_loss: 0.0920 394/500 [======================>.......] - ETA: 35s - loss: 0.8897 - regression_loss: 0.7978 - classification_loss: 0.0919 395/500 [======================>.......] - ETA: 35s - loss: 0.8895 - regression_loss: 0.7977 - classification_loss: 0.0918 396/500 [======================>.......] - ETA: 35s - loss: 0.8887 - regression_loss: 0.7971 - classification_loss: 0.0916 397/500 [======================>.......] - ETA: 34s - loss: 0.8888 - regression_loss: 0.7972 - classification_loss: 0.0916 398/500 [======================>.......] 
- ETA: 34s - loss: 0.8890 - regression_loss: 0.7973 - classification_loss: 0.0917 399/500 [======================>.......] - ETA: 34s - loss: 0.8889 - regression_loss: 0.7973 - classification_loss: 0.0916 400/500 [=======================>......] - ETA: 33s - loss: 0.8906 - regression_loss: 0.7990 - classification_loss: 0.0917 401/500 [=======================>......] - ETA: 33s - loss: 0.8893 - regression_loss: 0.7976 - classification_loss: 0.0916 402/500 [=======================>......] - ETA: 33s - loss: 0.8874 - regression_loss: 0.7960 - classification_loss: 0.0914 403/500 [=======================>......] - ETA: 32s - loss: 0.8880 - regression_loss: 0.7965 - classification_loss: 0.0915 404/500 [=======================>......] - ETA: 32s - loss: 0.8882 - regression_loss: 0.7966 - classification_loss: 0.0915 405/500 [=======================>......] - ETA: 32s - loss: 0.8875 - regression_loss: 0.7961 - classification_loss: 0.0914 406/500 [=======================>......] - ETA: 31s - loss: 0.8870 - regression_loss: 0.7958 - classification_loss: 0.0912 407/500 [=======================>......] - ETA: 31s - loss: 0.8881 - regression_loss: 0.7966 - classification_loss: 0.0914 408/500 [=======================>......] - ETA: 31s - loss: 0.8870 - regression_loss: 0.7957 - classification_loss: 0.0913 409/500 [=======================>......] - ETA: 30s - loss: 0.8855 - regression_loss: 0.7944 - classification_loss: 0.0911 410/500 [=======================>......] - ETA: 30s - loss: 0.8850 - regression_loss: 0.7940 - classification_loss: 0.0910 411/500 [=======================>......] - ETA: 30s - loss: 0.8842 - regression_loss: 0.7933 - classification_loss: 0.0909 412/500 [=======================>......] - ETA: 29s - loss: 0.8842 - regression_loss: 0.7934 - classification_loss: 0.0908 413/500 [=======================>......] - ETA: 29s - loss: 0.8840 - regression_loss: 0.7933 - classification_loss: 0.0907 414/500 [=======================>......] 
- ETA: 29s - loss: 0.8848 - regression_loss: 0.7941 - classification_loss: 0.0907 415/500 [=======================>......] - ETA: 28s - loss: 0.8871 - regression_loss: 0.7957 - classification_loss: 0.0914 416/500 [=======================>......] - ETA: 28s - loss: 0.8860 - regression_loss: 0.7948 - classification_loss: 0.0913 417/500 [========================>.....] - ETA: 28s - loss: 0.8859 - regression_loss: 0.7947 - classification_loss: 0.0912 418/500 [========================>.....] - ETA: 27s - loss: 0.8873 - regression_loss: 0.7959 - classification_loss: 0.0914 419/500 [========================>.....] - ETA: 27s - loss: 0.8882 - regression_loss: 0.7968 - classification_loss: 0.0914 420/500 [========================>.....] - ETA: 27s - loss: 0.8878 - regression_loss: 0.7964 - classification_loss: 0.0914 421/500 [========================>.....] - ETA: 26s - loss: 0.8882 - regression_loss: 0.7966 - classification_loss: 0.0916 422/500 [========================>.....] - ETA: 26s - loss: 0.8884 - regression_loss: 0.7963 - classification_loss: 0.0921 423/500 [========================>.....] - ETA: 26s - loss: 0.8900 - regression_loss: 0.7979 - classification_loss: 0.0921 424/500 [========================>.....] - ETA: 25s - loss: 0.8899 - regression_loss: 0.7978 - classification_loss: 0.0920 425/500 [========================>.....] - ETA: 25s - loss: 0.8904 - regression_loss: 0.7984 - classification_loss: 0.0920 426/500 [========================>.....] - ETA: 25s - loss: 0.8908 - regression_loss: 0.7987 - classification_loss: 0.0921 427/500 [========================>.....] - ETA: 24s - loss: 0.8921 - regression_loss: 0.7997 - classification_loss: 0.0925 428/500 [========================>.....] - ETA: 24s - loss: 0.8940 - regression_loss: 0.8016 - classification_loss: 0.0924 429/500 [========================>.....] - ETA: 24s - loss: 0.8979 - regression_loss: 0.8047 - classification_loss: 0.0933 430/500 [========================>.....] 
- ETA: 23s - loss: 0.8978 - regression_loss: 0.8046 - classification_loss: 0.0931 431/500 [========================>.....] - ETA: 23s - loss: 0.8976 - regression_loss: 0.8044 - classification_loss: 0.0932 432/500 [========================>.....] - ETA: 23s - loss: 0.8966 - regression_loss: 0.8036 - classification_loss: 0.0930 433/500 [========================>.....] - ETA: 22s - loss: 0.8956 - regression_loss: 0.8027 - classification_loss: 0.0929 434/500 [=========================>....] - ETA: 22s - loss: 0.8962 - regression_loss: 0.8033 - classification_loss: 0.0929 435/500 [=========================>....] - ETA: 22s - loss: 0.8957 - regression_loss: 0.8028 - classification_loss: 0.0929 436/500 [=========================>....] - ETA: 21s - loss: 0.8950 - regression_loss: 0.8022 - classification_loss: 0.0928 437/500 [=========================>....] - ETA: 21s - loss: 0.8952 - regression_loss: 0.8022 - classification_loss: 0.0929 438/500 [=========================>....] - ETA: 20s - loss: 0.8948 - regression_loss: 0.8019 - classification_loss: 0.0929 439/500 [=========================>....] - ETA: 20s - loss: 0.8953 - regression_loss: 0.8024 - classification_loss: 0.0929 440/500 [=========================>....] - ETA: 20s - loss: 0.8963 - regression_loss: 0.8034 - classification_loss: 0.0929 441/500 [=========================>....] - ETA: 19s - loss: 0.8958 - regression_loss: 0.8030 - classification_loss: 0.0928 442/500 [=========================>....] - ETA: 19s - loss: 0.8960 - regression_loss: 0.8032 - classification_loss: 0.0928 443/500 [=========================>....] - ETA: 19s - loss: 0.8960 - regression_loss: 0.8032 - classification_loss: 0.0928 444/500 [=========================>....] - ETA: 18s - loss: 0.8951 - regression_loss: 0.8024 - classification_loss: 0.0927 445/500 [=========================>....] - ETA: 18s - loss: 0.8950 - regression_loss: 0.8022 - classification_loss: 0.0928 446/500 [=========================>....] 
[... steps 447/500 through 499/500: per-step progress-bar updates omitted; loss held near 0.90 ...]
500/500 [==============================] - 169s 339ms/step - loss: 0.9059 - regression_loss: 0.8121 - classification_loss: 0.0938
326 instances of class plum with average precision: 0.8290
mAP: 0.8290
Epoch 00032: saving model to ./training/snapshots/resnet101_pascal_32.h5
Epoch 33/150
[... steps 1/500 through 281/500: per-step progress-bar updates omitted; loss near 0.85 at the point this excerpt ends ...]
- ETA: 1:14 - loss: 0.8526 - regression_loss: 0.7675 - classification_loss: 0.0851 282/500 [===============>..............] - ETA: 1:14 - loss: 0.8536 - regression_loss: 0.7684 - classification_loss: 0.0851 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8552 - regression_loss: 0.7698 - classification_loss: 0.0854 284/500 [================>.............] - ETA: 1:13 - loss: 0.8541 - regression_loss: 0.7689 - classification_loss: 0.0852 285/500 [================>.............] - ETA: 1:13 - loss: 0.8542 - regression_loss: 0.7690 - classification_loss: 0.0852 286/500 [================>.............] - ETA: 1:12 - loss: 0.8540 - regression_loss: 0.7689 - classification_loss: 0.0851 287/500 [================>.............] - ETA: 1:12 - loss: 0.8531 - regression_loss: 0.7682 - classification_loss: 0.0849 288/500 [================>.............] - ETA: 1:12 - loss: 0.8515 - regression_loss: 0.7668 - classification_loss: 0.0847 289/500 [================>.............] - ETA: 1:11 - loss: 0.8525 - regression_loss: 0.7673 - classification_loss: 0.0852 290/500 [================>.............] - ETA: 1:11 - loss: 0.8529 - regression_loss: 0.7677 - classification_loss: 0.0852 291/500 [================>.............] - ETA: 1:11 - loss: 0.8534 - regression_loss: 0.7682 - classification_loss: 0.0852 292/500 [================>.............] - ETA: 1:10 - loss: 0.8532 - regression_loss: 0.7681 - classification_loss: 0.0851 293/500 [================>.............] - ETA: 1:10 - loss: 0.8518 - regression_loss: 0.7669 - classification_loss: 0.0850 294/500 [================>.............] - ETA: 1:10 - loss: 0.8512 - regression_loss: 0.7662 - classification_loss: 0.0850 295/500 [================>.............] - ETA: 1:09 - loss: 0.8510 - regression_loss: 0.7660 - classification_loss: 0.0850 296/500 [================>.............] - ETA: 1:09 - loss: 0.8506 - regression_loss: 0.7656 - classification_loss: 0.0849 297/500 [================>.............] 
- ETA: 1:09 - loss: 0.8508 - regression_loss: 0.7656 - classification_loss: 0.0852 298/500 [================>.............] - ETA: 1:08 - loss: 0.8485 - regression_loss: 0.7635 - classification_loss: 0.0850 299/500 [================>.............] - ETA: 1:08 - loss: 0.8497 - regression_loss: 0.7643 - classification_loss: 0.0854 300/500 [=================>............] - ETA: 1:08 - loss: 0.8492 - regression_loss: 0.7638 - classification_loss: 0.0855 301/500 [=================>............] - ETA: 1:07 - loss: 0.8481 - regression_loss: 0.7626 - classification_loss: 0.0855 302/500 [=================>............] - ETA: 1:07 - loss: 0.8480 - regression_loss: 0.7627 - classification_loss: 0.0853 303/500 [=================>............] - ETA: 1:07 - loss: 0.8475 - regression_loss: 0.7623 - classification_loss: 0.0853 304/500 [=================>............] - ETA: 1:06 - loss: 0.8495 - regression_loss: 0.7638 - classification_loss: 0.0857 305/500 [=================>............] - ETA: 1:06 - loss: 0.8541 - regression_loss: 0.7677 - classification_loss: 0.0865 306/500 [=================>............] - ETA: 1:05 - loss: 0.8535 - regression_loss: 0.7670 - classification_loss: 0.0864 307/500 [=================>............] - ETA: 1:05 - loss: 0.8530 - regression_loss: 0.7664 - classification_loss: 0.0866 308/500 [=================>............] - ETA: 1:05 - loss: 0.8561 - regression_loss: 0.7690 - classification_loss: 0.0871 309/500 [=================>............] - ETA: 1:04 - loss: 0.8557 - regression_loss: 0.7686 - classification_loss: 0.0871 310/500 [=================>............] - ETA: 1:04 - loss: 0.8553 - regression_loss: 0.7681 - classification_loss: 0.0871 311/500 [=================>............] - ETA: 1:04 - loss: 0.8537 - regression_loss: 0.7667 - classification_loss: 0.0870 312/500 [=================>............] - ETA: 1:03 - loss: 0.8555 - regression_loss: 0.7680 - classification_loss: 0.0875 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.8554 - regression_loss: 0.7679 - classification_loss: 0.0876 314/500 [=================>............] - ETA: 1:03 - loss: 0.8557 - regression_loss: 0.7681 - classification_loss: 0.0876 315/500 [=================>............] - ETA: 1:02 - loss: 0.8534 - regression_loss: 0.7660 - classification_loss: 0.0874 316/500 [=================>............] - ETA: 1:02 - loss: 0.8540 - regression_loss: 0.7664 - classification_loss: 0.0876 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8544 - regression_loss: 0.7669 - classification_loss: 0.0875 318/500 [==================>...........] - ETA: 1:01 - loss: 0.8539 - regression_loss: 0.7665 - classification_loss: 0.0874 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8534 - regression_loss: 0.7661 - classification_loss: 0.0872 320/500 [==================>...........] - ETA: 1:01 - loss: 0.8519 - regression_loss: 0.7649 - classification_loss: 0.0870 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8519 - regression_loss: 0.7651 - classification_loss: 0.0869 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8503 - regression_loss: 0.7635 - classification_loss: 0.0867 323/500 [==================>...........] - ETA: 1:00 - loss: 0.8492 - regression_loss: 0.7627 - classification_loss: 0.0865 324/500 [==================>...........] - ETA: 59s - loss: 0.8488 - regression_loss: 0.7624 - classification_loss: 0.0864  325/500 [==================>...........] - ETA: 59s - loss: 0.8475 - regression_loss: 0.7613 - classification_loss: 0.0862 326/500 [==================>...........] - ETA: 59s - loss: 0.8462 - regression_loss: 0.7601 - classification_loss: 0.0861 327/500 [==================>...........] - ETA: 58s - loss: 0.8454 - regression_loss: 0.7594 - classification_loss: 0.0860 328/500 [==================>...........] - ETA: 58s - loss: 0.8449 - regression_loss: 0.7590 - classification_loss: 0.0860 329/500 [==================>...........] 
- ETA: 58s - loss: 0.8447 - regression_loss: 0.7589 - classification_loss: 0.0859 330/500 [==================>...........] - ETA: 57s - loss: 0.8439 - regression_loss: 0.7581 - classification_loss: 0.0858 331/500 [==================>...........] - ETA: 57s - loss: 0.8439 - regression_loss: 0.7581 - classification_loss: 0.0858 332/500 [==================>...........] - ETA: 57s - loss: 0.8436 - regression_loss: 0.7579 - classification_loss: 0.0857 333/500 [==================>...........] - ETA: 56s - loss: 0.8440 - regression_loss: 0.7583 - classification_loss: 0.0857 334/500 [===================>..........] - ETA: 56s - loss: 0.8456 - regression_loss: 0.7594 - classification_loss: 0.0861 335/500 [===================>..........] - ETA: 56s - loss: 0.8453 - regression_loss: 0.7592 - classification_loss: 0.0861 336/500 [===================>..........] - ETA: 55s - loss: 0.8454 - regression_loss: 0.7594 - classification_loss: 0.0860 337/500 [===================>..........] - ETA: 55s - loss: 0.8458 - regression_loss: 0.7597 - classification_loss: 0.0861 338/500 [===================>..........] - ETA: 55s - loss: 0.8458 - regression_loss: 0.7599 - classification_loss: 0.0859 339/500 [===================>..........] - ETA: 54s - loss: 0.8455 - regression_loss: 0.7597 - classification_loss: 0.0858 340/500 [===================>..........] - ETA: 54s - loss: 0.8467 - regression_loss: 0.7609 - classification_loss: 0.0857 341/500 [===================>..........] - ETA: 54s - loss: 0.8514 - regression_loss: 0.7652 - classification_loss: 0.0862 342/500 [===================>..........] - ETA: 53s - loss: 0.8528 - regression_loss: 0.7665 - classification_loss: 0.0863 343/500 [===================>..........] - ETA: 53s - loss: 0.8531 - regression_loss: 0.7667 - classification_loss: 0.0864 344/500 [===================>..........] - ETA: 53s - loss: 0.8513 - regression_loss: 0.7650 - classification_loss: 0.0863 345/500 [===================>..........] 
- ETA: 52s - loss: 0.8517 - regression_loss: 0.7653 - classification_loss: 0.0864 346/500 [===================>..........] - ETA: 52s - loss: 0.8521 - regression_loss: 0.7658 - classification_loss: 0.0863 347/500 [===================>..........] - ETA: 52s - loss: 0.8533 - regression_loss: 0.7670 - classification_loss: 0.0863 348/500 [===================>..........] - ETA: 51s - loss: 0.8509 - regression_loss: 0.7648 - classification_loss: 0.0861 349/500 [===================>..........] - ETA: 51s - loss: 0.8503 - regression_loss: 0.7643 - classification_loss: 0.0859 350/500 [====================>.........] - ETA: 51s - loss: 0.8508 - regression_loss: 0.7646 - classification_loss: 0.0863 351/500 [====================>.........] - ETA: 50s - loss: 0.8517 - regression_loss: 0.7652 - classification_loss: 0.0865 352/500 [====================>.........] - ETA: 50s - loss: 0.8499 - regression_loss: 0.7636 - classification_loss: 0.0863 353/500 [====================>.........] - ETA: 50s - loss: 0.8498 - regression_loss: 0.7634 - classification_loss: 0.0863 354/500 [====================>.........] - ETA: 49s - loss: 0.8502 - regression_loss: 0.7638 - classification_loss: 0.0864 355/500 [====================>.........] - ETA: 49s - loss: 0.8495 - regression_loss: 0.7631 - classification_loss: 0.0863 356/500 [====================>.........] - ETA: 48s - loss: 0.8517 - regression_loss: 0.7655 - classification_loss: 0.0863 357/500 [====================>.........] - ETA: 48s - loss: 0.8511 - regression_loss: 0.7649 - classification_loss: 0.0863 358/500 [====================>.........] - ETA: 48s - loss: 0.8506 - regression_loss: 0.7643 - classification_loss: 0.0862 359/500 [====================>.........] - ETA: 47s - loss: 0.8506 - regression_loss: 0.7645 - classification_loss: 0.0861 360/500 [====================>.........] - ETA: 47s - loss: 0.8501 - regression_loss: 0.7642 - classification_loss: 0.0859 361/500 [====================>.........] 
- ETA: 47s - loss: 0.8527 - regression_loss: 0.7662 - classification_loss: 0.0865 362/500 [====================>.........] - ETA: 46s - loss: 0.8531 - regression_loss: 0.7667 - classification_loss: 0.0865 363/500 [====================>.........] - ETA: 46s - loss: 0.8520 - regression_loss: 0.7657 - classification_loss: 0.0863 364/500 [====================>.........] - ETA: 46s - loss: 0.8514 - regression_loss: 0.7652 - classification_loss: 0.0862 365/500 [====================>.........] - ETA: 45s - loss: 0.8505 - regression_loss: 0.7644 - classification_loss: 0.0861 366/500 [====================>.........] - ETA: 45s - loss: 0.8508 - regression_loss: 0.7648 - classification_loss: 0.0860 367/500 [=====================>........] - ETA: 45s - loss: 0.8499 - regression_loss: 0.7640 - classification_loss: 0.0858 368/500 [=====================>........] - ETA: 44s - loss: 0.8493 - regression_loss: 0.7634 - classification_loss: 0.0860 369/500 [=====================>........] - ETA: 44s - loss: 0.8501 - regression_loss: 0.7642 - classification_loss: 0.0859 370/500 [=====================>........] - ETA: 44s - loss: 0.8506 - regression_loss: 0.7648 - classification_loss: 0.0858 371/500 [=====================>........] - ETA: 43s - loss: 0.8499 - regression_loss: 0.7643 - classification_loss: 0.0857 372/500 [=====================>........] - ETA: 43s - loss: 0.8504 - regression_loss: 0.7647 - classification_loss: 0.0857 373/500 [=====================>........] - ETA: 43s - loss: 0.8504 - regression_loss: 0.7648 - classification_loss: 0.0856 374/500 [=====================>........] - ETA: 42s - loss: 0.8500 - regression_loss: 0.7645 - classification_loss: 0.0855 375/500 [=====================>........] - ETA: 42s - loss: 0.8495 - regression_loss: 0.7641 - classification_loss: 0.0854 376/500 [=====================>........] - ETA: 42s - loss: 0.8502 - regression_loss: 0.7648 - classification_loss: 0.0854 377/500 [=====================>........] 
- ETA: 41s - loss: 0.8491 - regression_loss: 0.7638 - classification_loss: 0.0853 378/500 [=====================>........] - ETA: 41s - loss: 0.8498 - regression_loss: 0.7646 - classification_loss: 0.0852 379/500 [=====================>........] - ETA: 41s - loss: 0.8494 - regression_loss: 0.7643 - classification_loss: 0.0850 380/500 [=====================>........] - ETA: 40s - loss: 0.8486 - regression_loss: 0.7637 - classification_loss: 0.0849 381/500 [=====================>........] - ETA: 40s - loss: 0.8496 - regression_loss: 0.7645 - classification_loss: 0.0851 382/500 [=====================>........] - ETA: 40s - loss: 0.8510 - regression_loss: 0.7655 - classification_loss: 0.0855 383/500 [=====================>........] - ETA: 39s - loss: 0.8501 - regression_loss: 0.7648 - classification_loss: 0.0854 384/500 [======================>.......] - ETA: 39s - loss: 0.8519 - regression_loss: 0.7663 - classification_loss: 0.0856 385/500 [======================>.......] - ETA: 39s - loss: 0.8511 - regression_loss: 0.7657 - classification_loss: 0.0855 386/500 [======================>.......] - ETA: 38s - loss: 0.8560 - regression_loss: 0.7694 - classification_loss: 0.0867 387/500 [======================>.......] - ETA: 38s - loss: 0.8558 - regression_loss: 0.7691 - classification_loss: 0.0867 388/500 [======================>.......] - ETA: 38s - loss: 0.8551 - regression_loss: 0.7685 - classification_loss: 0.0866 389/500 [======================>.......] - ETA: 37s - loss: 0.8545 - regression_loss: 0.7681 - classification_loss: 0.0864 390/500 [======================>.......] - ETA: 37s - loss: 0.8533 - regression_loss: 0.7670 - classification_loss: 0.0863 391/500 [======================>.......] - ETA: 37s - loss: 0.8538 - regression_loss: 0.7675 - classification_loss: 0.0863 392/500 [======================>.......] - ETA: 36s - loss: 0.8523 - regression_loss: 0.7662 - classification_loss: 0.0861 393/500 [======================>.......] 
- ETA: 36s - loss: 0.8519 - regression_loss: 0.7660 - classification_loss: 0.0860 394/500 [======================>.......] - ETA: 36s - loss: 0.8531 - regression_loss: 0.7670 - classification_loss: 0.0861 395/500 [======================>.......] - ETA: 35s - loss: 0.8539 - regression_loss: 0.7677 - classification_loss: 0.0861 396/500 [======================>.......] - ETA: 35s - loss: 0.8534 - regression_loss: 0.7674 - classification_loss: 0.0860 397/500 [======================>.......] - ETA: 35s - loss: 0.8533 - regression_loss: 0.7673 - classification_loss: 0.0860 398/500 [======================>.......] - ETA: 34s - loss: 0.8545 - regression_loss: 0.7682 - classification_loss: 0.0863 399/500 [======================>.......] - ETA: 34s - loss: 0.8548 - regression_loss: 0.7683 - classification_loss: 0.0865 400/500 [=======================>......] - ETA: 33s - loss: 0.8560 - regression_loss: 0.7692 - classification_loss: 0.0868 401/500 [=======================>......] - ETA: 33s - loss: 0.8554 - regression_loss: 0.7687 - classification_loss: 0.0867 402/500 [=======================>......] - ETA: 33s - loss: 0.8561 - regression_loss: 0.7692 - classification_loss: 0.0868 403/500 [=======================>......] - ETA: 32s - loss: 0.8552 - regression_loss: 0.7684 - classification_loss: 0.0867 404/500 [=======================>......] - ETA: 32s - loss: 0.8554 - regression_loss: 0.7687 - classification_loss: 0.0867 405/500 [=======================>......] - ETA: 32s - loss: 0.8583 - regression_loss: 0.7709 - classification_loss: 0.0875 406/500 [=======================>......] - ETA: 31s - loss: 0.8584 - regression_loss: 0.7709 - classification_loss: 0.0875 407/500 [=======================>......] - ETA: 31s - loss: 0.8586 - regression_loss: 0.7709 - classification_loss: 0.0877 408/500 [=======================>......] - ETA: 31s - loss: 0.8585 - regression_loss: 0.7708 - classification_loss: 0.0877 409/500 [=======================>......] 
- ETA: 30s - loss: 0.8592 - regression_loss: 0.7713 - classification_loss: 0.0878 410/500 [=======================>......] - ETA: 30s - loss: 0.8594 - regression_loss: 0.7715 - classification_loss: 0.0879 411/500 [=======================>......] - ETA: 30s - loss: 0.8584 - regression_loss: 0.7707 - classification_loss: 0.0877 412/500 [=======================>......] - ETA: 29s - loss: 0.8581 - regression_loss: 0.7703 - classification_loss: 0.0878 413/500 [=======================>......] - ETA: 29s - loss: 0.8583 - regression_loss: 0.7705 - classification_loss: 0.0878 414/500 [=======================>......] - ETA: 29s - loss: 0.8600 - regression_loss: 0.7718 - classification_loss: 0.0882 415/500 [=======================>......] - ETA: 28s - loss: 0.8602 - regression_loss: 0.7721 - classification_loss: 0.0881 416/500 [=======================>......] - ETA: 28s - loss: 0.8600 - regression_loss: 0.7719 - classification_loss: 0.0880 417/500 [========================>.....] - ETA: 28s - loss: 0.8611 - regression_loss: 0.7729 - classification_loss: 0.0882 418/500 [========================>.....] - ETA: 27s - loss: 0.8601 - regression_loss: 0.7720 - classification_loss: 0.0881 419/500 [========================>.....] - ETA: 27s - loss: 0.8599 - regression_loss: 0.7718 - classification_loss: 0.0880 420/500 [========================>.....] - ETA: 27s - loss: 0.8588 - regression_loss: 0.7708 - classification_loss: 0.0879 421/500 [========================>.....] - ETA: 26s - loss: 0.8588 - regression_loss: 0.7708 - classification_loss: 0.0880 422/500 [========================>.....] - ETA: 26s - loss: 0.8582 - regression_loss: 0.7703 - classification_loss: 0.0879 423/500 [========================>.....] - ETA: 26s - loss: 0.8569 - regression_loss: 0.7691 - classification_loss: 0.0877 424/500 [========================>.....] - ETA: 25s - loss: 0.8571 - regression_loss: 0.7694 - classification_loss: 0.0877 425/500 [========================>.....] 
- ETA: 25s - loss: 0.8586 - regression_loss: 0.7707 - classification_loss: 0.0879 426/500 [========================>.....] - ETA: 25s - loss: 0.8582 - regression_loss: 0.7703 - classification_loss: 0.0879 427/500 [========================>.....] - ETA: 24s - loss: 0.8591 - regression_loss: 0.7711 - classification_loss: 0.0880 428/500 [========================>.....] - ETA: 24s - loss: 0.8588 - regression_loss: 0.7709 - classification_loss: 0.0879 429/500 [========================>.....] - ETA: 24s - loss: 0.8601 - regression_loss: 0.7721 - classification_loss: 0.0880 430/500 [========================>.....] - ETA: 23s - loss: 0.8596 - regression_loss: 0.7718 - classification_loss: 0.0878 431/500 [========================>.....] - ETA: 23s - loss: 0.8599 - regression_loss: 0.7721 - classification_loss: 0.0878 432/500 [========================>.....] - ETA: 23s - loss: 0.8603 - regression_loss: 0.7725 - classification_loss: 0.0878 433/500 [========================>.....] - ETA: 22s - loss: 0.8602 - regression_loss: 0.7724 - classification_loss: 0.0877 434/500 [=========================>....] - ETA: 22s - loss: 0.8598 - regression_loss: 0.7722 - classification_loss: 0.0876 435/500 [=========================>....] - ETA: 22s - loss: 0.8591 - regression_loss: 0.7717 - classification_loss: 0.0874 436/500 [=========================>....] - ETA: 21s - loss: 0.8586 - regression_loss: 0.7713 - classification_loss: 0.0874 437/500 [=========================>....] - ETA: 21s - loss: 0.8596 - regression_loss: 0.7721 - classification_loss: 0.0875 438/500 [=========================>....] - ETA: 21s - loss: 0.8608 - regression_loss: 0.7732 - classification_loss: 0.0876 439/500 [=========================>....] - ETA: 20s - loss: 0.8604 - regression_loss: 0.7729 - classification_loss: 0.0875 440/500 [=========================>....] - ETA: 20s - loss: 0.8613 - regression_loss: 0.7737 - classification_loss: 0.0876 441/500 [=========================>....] 
- ETA: 20s - loss: 0.8621 - regression_loss: 0.7743 - classification_loss: 0.0877 442/500 [=========================>....] - ETA: 19s - loss: 0.8615 - regression_loss: 0.7740 - classification_loss: 0.0876 443/500 [=========================>....] - ETA: 19s - loss: 0.8614 - regression_loss: 0.7738 - classification_loss: 0.0877 444/500 [=========================>....] - ETA: 19s - loss: 0.8602 - regression_loss: 0.7727 - classification_loss: 0.0875 445/500 [=========================>....] - ETA: 18s - loss: 0.8589 - regression_loss: 0.7715 - classification_loss: 0.0874 446/500 [=========================>....] - ETA: 18s - loss: 0.8591 - regression_loss: 0.7717 - classification_loss: 0.0874 447/500 [=========================>....] - ETA: 18s - loss: 0.8605 - regression_loss: 0.7730 - classification_loss: 0.0875 448/500 [=========================>....] - ETA: 17s - loss: 0.8602 - regression_loss: 0.7726 - classification_loss: 0.0876 449/500 [=========================>....] - ETA: 17s - loss: 0.8600 - regression_loss: 0.7725 - classification_loss: 0.0875 450/500 [==========================>...] - ETA: 17s - loss: 0.8593 - regression_loss: 0.7718 - classification_loss: 0.0875 451/500 [==========================>...] - ETA: 16s - loss: 0.8592 - regression_loss: 0.7718 - classification_loss: 0.0874 452/500 [==========================>...] - ETA: 16s - loss: 0.8590 - regression_loss: 0.7716 - classification_loss: 0.0874 453/500 [==========================>...] - ETA: 15s - loss: 0.8595 - regression_loss: 0.7721 - classification_loss: 0.0875 454/500 [==========================>...] - ETA: 15s - loss: 0.8592 - regression_loss: 0.7718 - classification_loss: 0.0874 455/500 [==========================>...] - ETA: 15s - loss: 0.8588 - regression_loss: 0.7714 - classification_loss: 0.0874 456/500 [==========================>...] - ETA: 14s - loss: 0.8584 - regression_loss: 0.7712 - classification_loss: 0.0872 457/500 [==========================>...] 
- ETA: 14s - loss: 0.8577 - regression_loss: 0.7707 - classification_loss: 0.0871 458/500 [==========================>...] - ETA: 14s - loss: 0.8575 - regression_loss: 0.7705 - classification_loss: 0.0869 459/500 [==========================>...] - ETA: 13s - loss: 0.8577 - regression_loss: 0.7707 - classification_loss: 0.0869 460/500 [==========================>...] - ETA: 13s - loss: 0.8577 - regression_loss: 0.7707 - classification_loss: 0.0869 461/500 [==========================>...] - ETA: 13s - loss: 0.8575 - regression_loss: 0.7706 - classification_loss: 0.0869 462/500 [==========================>...] - ETA: 12s - loss: 0.8572 - regression_loss: 0.7704 - classification_loss: 0.0868 463/500 [==========================>...] - ETA: 12s - loss: 0.8586 - regression_loss: 0.7711 - classification_loss: 0.0875 464/500 [==========================>...] - ETA: 12s - loss: 0.8574 - regression_loss: 0.7700 - classification_loss: 0.0874 465/500 [==========================>...] - ETA: 11s - loss: 0.8567 - regression_loss: 0.7694 - classification_loss: 0.0873 466/500 [==========================>...] - ETA: 11s - loss: 0.8565 - regression_loss: 0.7692 - classification_loss: 0.0873 467/500 [===========================>..] - ETA: 11s - loss: 0.8570 - regression_loss: 0.7697 - classification_loss: 0.0873 468/500 [===========================>..] - ETA: 10s - loss: 0.8563 - regression_loss: 0.7692 - classification_loss: 0.0872 469/500 [===========================>..] - ETA: 10s - loss: 0.8579 - regression_loss: 0.7706 - classification_loss: 0.0873 470/500 [===========================>..] - ETA: 10s - loss: 0.8581 - regression_loss: 0.7708 - classification_loss: 0.0873 471/500 [===========================>..] - ETA: 9s - loss: 0.8578 - regression_loss: 0.7706 - classification_loss: 0.0872  472/500 [===========================>..] - ETA: 9s - loss: 0.8597 - regression_loss: 0.7724 - classification_loss: 0.0874 473/500 [===========================>..] 
- ETA: 9s - loss: 0.8612 - regression_loss: 0.7734 - classification_loss: 0.0878 474/500 [===========================>..] - ETA: 8s - loss: 0.8607 - regression_loss: 0.7729 - classification_loss: 0.0877 475/500 [===========================>..] - ETA: 8s - loss: 0.8603 - regression_loss: 0.7727 - classification_loss: 0.0876 476/500 [===========================>..] - ETA: 8s - loss: 0.8602 - regression_loss: 0.7727 - classification_loss: 0.0875 477/500 [===========================>..] - ETA: 7s - loss: 0.8599 - regression_loss: 0.7724 - classification_loss: 0.0874 478/500 [===========================>..] - ETA: 7s - loss: 0.8596 - regression_loss: 0.7722 - classification_loss: 0.0874 479/500 [===========================>..] - ETA: 7s - loss: 0.8592 - regression_loss: 0.7719 - classification_loss: 0.0873 480/500 [===========================>..] - ETA: 6s - loss: 0.8600 - regression_loss: 0.7727 - classification_loss: 0.0872 481/500 [===========================>..] - ETA: 6s - loss: 0.8596 - regression_loss: 0.7725 - classification_loss: 0.0871 482/500 [===========================>..] - ETA: 6s - loss: 0.8592 - regression_loss: 0.7721 - classification_loss: 0.0871 483/500 [===========================>..] - ETA: 5s - loss: 0.8586 - regression_loss: 0.7716 - classification_loss: 0.0870 484/500 [============================>.] - ETA: 5s - loss: 0.8588 - regression_loss: 0.7717 - classification_loss: 0.0871 485/500 [============================>.] - ETA: 5s - loss: 0.8598 - regression_loss: 0.7725 - classification_loss: 0.0873 486/500 [============================>.] - ETA: 4s - loss: 0.8597 - regression_loss: 0.7725 - classification_loss: 0.0872 487/500 [============================>.] - ETA: 4s - loss: 0.8595 - regression_loss: 0.7724 - classification_loss: 0.0871 488/500 [============================>.] - ETA: 4s - loss: 0.8587 - regression_loss: 0.7717 - classification_loss: 0.0870 489/500 [============================>.] 
- ETA: 3s - loss: 0.8583 - regression_loss: 0.7715 - classification_loss: 0.0868 490/500 [============================>.] - ETA: 3s - loss: 0.8576 - regression_loss: 0.7709 - classification_loss: 0.0867 491/500 [============================>.] - ETA: 3s - loss: 0.8578 - regression_loss: 0.7712 - classification_loss: 0.0867 492/500 [============================>.] - ETA: 2s - loss: 0.8585 - regression_loss: 0.7719 - classification_loss: 0.0866 493/500 [============================>.] - ETA: 2s - loss: 0.8579 - regression_loss: 0.7714 - classification_loss: 0.0865 494/500 [============================>.] - ETA: 2s - loss: 0.8576 - regression_loss: 0.7712 - classification_loss: 0.0864 495/500 [============================>.] - ETA: 1s - loss: 0.8565 - regression_loss: 0.7703 - classification_loss: 0.0863 496/500 [============================>.] - ETA: 1s - loss: 0.8566 - regression_loss: 0.7704 - classification_loss: 0.0862 497/500 [============================>.] - ETA: 1s - loss: 0.8575 - regression_loss: 0.7712 - classification_loss: 0.0863 498/500 [============================>.] - ETA: 0s - loss: 0.8583 - regression_loss: 0.7719 - classification_loss: 0.0863 499/500 [============================>.] - ETA: 0s - loss: 0.8593 - regression_loss: 0.7728 - classification_loss: 0.0864 
500/500 [==============================] - 170s 340ms/step - loss: 0.8589 - regression_loss: 0.7725 - classification_loss: 0.0864
326 instances of class plum with average precision: 0.8213
mAP: 0.8213
Epoch 00033: saving model to ./training/snapshots/resnet101_pascal_33.h5
Epoch 34/150
1/500 [..............................] - ETA: 2:44 - loss: 0.8230 - regression_loss: 0.7616 - classification_loss: 0.0615 2/500 [..............................] - ETA: 2:50 - loss: 0.5893 - regression_loss: 0.5430 - classification_loss: 0.0463 3/500 [..............................] - ETA: 2:48 - loss: 0.5382 - regression_loss: 0.4964 - classification_loss: 0.0418 4/500 [..............................] 
- ETA: 2:45 - loss: 0.6798 - regression_loss: 0.6292 - classification_loss: 0.0506 5/500 [..............................] - ETA: 2:46 - loss: 0.6407 - regression_loss: 0.5947 - classification_loss: 0.0461 6/500 [..............................] - ETA: 2:47 - loss: 0.7410 - regression_loss: 0.6832 - classification_loss: 0.0577 7/500 [..............................] - ETA: 2:46 - loss: 0.8175 - regression_loss: 0.7406 - classification_loss: 0.0769 8/500 [..............................] - ETA: 2:47 - loss: 0.7801 - regression_loss: 0.7085 - classification_loss: 0.0715 9/500 [..............................] - ETA: 2:47 - loss: 0.7424 - regression_loss: 0.6752 - classification_loss: 0.0672 10/500 [..............................] - ETA: 2:47 - loss: 0.7119 - regression_loss: 0.6496 - classification_loss: 0.0624 11/500 [..............................] - ETA: 2:46 - loss: 0.7426 - regression_loss: 0.6750 - classification_loss: 0.0677 12/500 [..............................] - ETA: 2:47 - loss: 0.7375 - regression_loss: 0.6725 - classification_loss: 0.0650 13/500 [..............................] - ETA: 2:47 - loss: 0.7472 - regression_loss: 0.6803 - classification_loss: 0.0669 14/500 [..............................] - ETA: 2:46 - loss: 0.7442 - regression_loss: 0.6791 - classification_loss: 0.0651 15/500 [..............................] - ETA: 2:45 - loss: 0.7743 - regression_loss: 0.6972 - classification_loss: 0.0771 16/500 [..............................] - ETA: 2:45 - loss: 0.7805 - regression_loss: 0.7013 - classification_loss: 0.0792 17/500 [>.............................] - ETA: 2:45 - loss: 0.7671 - regression_loss: 0.6904 - classification_loss: 0.0767 18/500 [>.............................] - ETA: 2:44 - loss: 0.7797 - regression_loss: 0.7028 - classification_loss: 0.0769 19/500 [>.............................] - ETA: 2:44 - loss: 0.7914 - regression_loss: 0.7141 - classification_loss: 0.0773 20/500 [>.............................] 
- ETA: 2:43 - loss: 1.1575 - regression_loss: 0.7558 - classification_loss: 0.4017 21/500 [>.............................] - ETA: 2:43 - loss: 1.1523 - regression_loss: 0.7654 - classification_loss: 0.3869 22/500 [>.............................] - ETA: 2:43 - loss: 1.1369 - regression_loss: 0.7651 - classification_loss: 0.3718 23/500 [>.............................] - ETA: 2:43 - loss: 1.1159 - regression_loss: 0.7544 - classification_loss: 0.3615 24/500 [>.............................] - ETA: 2:42 - loss: 1.1127 - regression_loss: 0.7634 - classification_loss: 0.3493 25/500 [>.............................] - ETA: 2:42 - loss: 1.0886 - regression_loss: 0.7507 - classification_loss: 0.3379 26/500 [>.............................] - ETA: 2:41 - loss: 1.0748 - regression_loss: 0.7473 - classification_loss: 0.3275 27/500 [>.............................] - ETA: 2:41 - loss: 1.0940 - regression_loss: 0.7702 - classification_loss: 0.3238 28/500 [>.............................] - ETA: 2:41 - loss: 1.1039 - regression_loss: 0.7794 - classification_loss: 0.3245 29/500 [>.............................] - ETA: 2:41 - loss: 1.0905 - regression_loss: 0.7748 - classification_loss: 0.3157 30/500 [>.............................] - ETA: 2:41 - loss: 1.0816 - regression_loss: 0.7740 - classification_loss: 0.3075 31/500 [>.............................] - ETA: 2:41 - loss: 1.0544 - regression_loss: 0.7562 - classification_loss: 0.2982 32/500 [>.............................] - ETA: 2:40 - loss: 1.0522 - regression_loss: 0.7607 - classification_loss: 0.2915 33/500 [>.............................] - ETA: 2:40 - loss: 1.0512 - regression_loss: 0.7659 - classification_loss: 0.2853 34/500 [=>............................] - ETA: 2:40 - loss: 1.0456 - regression_loss: 0.7659 - classification_loss: 0.2796 35/500 [=>............................] - ETA: 2:39 - loss: 1.0438 - regression_loss: 0.7692 - classification_loss: 0.2746 36/500 [=>............................] 
- ETA: 2:39 - loss: 1.0291 - regression_loss: 0.7611 - classification_loss: 0.2680 37/500 [=>............................] - ETA: 2:38 - loss: 1.0283 - regression_loss: 0.7648 - classification_loss: 0.2636 38/500 [=>............................] - ETA: 2:38 - loss: 1.0297 - regression_loss: 0.7668 - classification_loss: 0.2629 39/500 [=>............................] - ETA: 2:38 - loss: 1.0389 - regression_loss: 0.7754 - classification_loss: 0.2635 40/500 [=>............................] - ETA: 2:37 - loss: 1.0373 - regression_loss: 0.7783 - classification_loss: 0.2590 41/500 [=>............................] - ETA: 2:37 - loss: 1.0434 - regression_loss: 0.7890 - classification_loss: 0.2544 42/500 [=>............................] - ETA: 2:36 - loss: 1.0195 - regression_loss: 0.7702 - classification_loss: 0.2493 43/500 [=>............................] - ETA: 2:36 - loss: 1.0110 - regression_loss: 0.7665 - classification_loss: 0.2445 44/500 [=>............................] - ETA: 2:35 - loss: 0.9934 - regression_loss: 0.7543 - classification_loss: 0.2391 45/500 [=>............................] - ETA: 2:35 - loss: 0.9957 - regression_loss: 0.7598 - classification_loss: 0.2359 46/500 [=>............................] - ETA: 2:34 - loss: 0.9925 - regression_loss: 0.7598 - classification_loss: 0.2326 47/500 [=>............................] - ETA: 2:34 - loss: 0.9884 - regression_loss: 0.7592 - classification_loss: 0.2292 48/500 [=>............................] - ETA: 2:33 - loss: 0.9890 - regression_loss: 0.7636 - classification_loss: 0.2254 49/500 [=>............................] - ETA: 2:33 - loss: 0.9907 - regression_loss: 0.7673 - classification_loss: 0.2233 50/500 [==>...........................] - ETA: 2:32 - loss: 0.9817 - regression_loss: 0.7619 - classification_loss: 0.2198 51/500 [==>...........................] - ETA: 2:32 - loss: 1.0081 - regression_loss: 0.7863 - classification_loss: 0.2217 52/500 [==>...........................] 
- ETA: 2:32 - loss: 0.9985 - regression_loss: 0.7800 - classification_loss: 0.2184 53/500 [==>...........................] - ETA: 2:32 - loss: 0.9968 - regression_loss: 0.7809 - classification_loss: 0.2159 54/500 [==>...........................] - ETA: 2:31 - loss: 0.9879 - regression_loss: 0.7755 - classification_loss: 0.2124 55/500 [==>...........................] - ETA: 2:31 - loss: 0.9999 - regression_loss: 0.7861 - classification_loss: 0.2138 56/500 [==>...........................] - ETA: 2:31 - loss: 0.9970 - regression_loss: 0.7862 - classification_loss: 0.2108 57/500 [==>...........................] - ETA: 2:30 - loss: 0.9915 - regression_loss: 0.7835 - classification_loss: 0.2081 58/500 [==>...........................] - ETA: 2:30 - loss: 0.9886 - regression_loss: 0.7825 - classification_loss: 0.2061 59/500 [==>...........................] - ETA: 2:30 - loss: 0.9808 - regression_loss: 0.7773 - classification_loss: 0.2035 60/500 [==>...........................] - ETA: 2:29 - loss: 0.9763 - regression_loss: 0.7754 - classification_loss: 0.2009 61/500 [==>...........................] - ETA: 2:29 - loss: 0.9763 - regression_loss: 0.7769 - classification_loss: 0.1994 62/500 [==>...........................] - ETA: 2:29 - loss: 0.9728 - regression_loss: 0.7749 - classification_loss: 0.1979 63/500 [==>...........................] - ETA: 2:28 - loss: 0.9691 - regression_loss: 0.7737 - classification_loss: 0.1953 64/500 [==>...........................] - ETA: 2:28 - loss: 0.9726 - regression_loss: 0.7783 - classification_loss: 0.1942 65/500 [==>...........................] - ETA: 2:27 - loss: 0.9666 - regression_loss: 0.7745 - classification_loss: 0.1920 66/500 [==>...........................] - ETA: 2:27 - loss: 0.9669 - regression_loss: 0.7767 - classification_loss: 0.1901 67/500 [===>..........................] - ETA: 2:27 - loss: 0.9560 - regression_loss: 0.7684 - classification_loss: 0.1876 68/500 [===>..........................] 
- ETA: 2:26 - loss: 0.9482 - regression_loss: 0.7629 - classification_loss: 0.1853 69/500 [===>..........................] - ETA: 2:26 - loss: 0.9459 - regression_loss: 0.7618 - classification_loss: 0.1842 70/500 [===>..........................] - ETA: 2:26 - loss: 0.9395 - regression_loss: 0.7574 - classification_loss: 0.1822 71/500 [===>..........................] - ETA: 2:25 - loss: 0.9331 - regression_loss: 0.7529 - classification_loss: 0.1801 72/500 [===>..........................] - ETA: 2:25 - loss: 0.9306 - regression_loss: 0.7523 - classification_loss: 0.1782 73/500 [===>..........................] - ETA: 2:25 - loss: 0.9320 - regression_loss: 0.7546 - classification_loss: 0.1773 74/500 [===>..........................] - ETA: 2:24 - loss: 0.9272 - regression_loss: 0.7515 - classification_loss: 0.1757 75/500 [===>..........................] - ETA: 2:24 - loss: 0.9227 - regression_loss: 0.7488 - classification_loss: 0.1739 76/500 [===>..........................] - ETA: 2:24 - loss: 0.9465 - regression_loss: 0.7687 - classification_loss: 0.1778 77/500 [===>..........................] - ETA: 2:23 - loss: 0.9520 - regression_loss: 0.7739 - classification_loss: 0.1781 78/500 [===>..........................] - ETA: 2:23 - loss: 0.9470 - regression_loss: 0.7706 - classification_loss: 0.1763 79/500 [===>..........................] - ETA: 2:23 - loss: 0.9510 - regression_loss: 0.7758 - classification_loss: 0.1752 80/500 [===>..........................] - ETA: 2:22 - loss: 0.9541 - regression_loss: 0.7791 - classification_loss: 0.1750 81/500 [===>..........................] - ETA: 2:22 - loss: 0.9668 - regression_loss: 0.7927 - classification_loss: 0.1740 82/500 [===>..........................] - ETA: 2:22 - loss: 0.9680 - regression_loss: 0.7954 - classification_loss: 0.1726 83/500 [===>..........................] - ETA: 2:21 - loss: 0.9643 - regression_loss: 0.7931 - classification_loss: 0.1712 84/500 [====>.........................] 
- ETA: 2:21 - loss: 0.9692 - regression_loss: 0.7975 - classification_loss: 0.1718 85/500 [====>.........................] - ETA: 2:21 - loss: 0.9659 - regression_loss: 0.7956 - classification_loss: 0.1703 86/500 [====>.........................] - ETA: 2:20 - loss: 0.9657 - regression_loss: 0.7962 - classification_loss: 0.1695 87/500 [====>.........................] - ETA: 2:20 - loss: 0.9601 - regression_loss: 0.7921 - classification_loss: 0.1680 88/500 [====>.........................] - ETA: 2:19 - loss: 0.9568 - regression_loss: 0.7905 - classification_loss: 0.1663 89/500 [====>.........................] - ETA: 2:19 - loss: 0.9529 - regression_loss: 0.7880 - classification_loss: 0.1649 90/500 [====>.........................] - ETA: 2:19 - loss: 0.9511 - regression_loss: 0.7875 - classification_loss: 0.1637 91/500 [====>.........................] - ETA: 2:19 - loss: 0.9600 - regression_loss: 0.7971 - classification_loss: 0.1629 92/500 [====>.........................] - ETA: 2:18 - loss: 0.9561 - regression_loss: 0.7944 - classification_loss: 0.1617 93/500 [====>.........................] - ETA: 2:18 - loss: 0.9551 - regression_loss: 0.7946 - classification_loss: 0.1605 94/500 [====>.........................] - ETA: 2:18 - loss: 0.9500 - regression_loss: 0.7909 - classification_loss: 0.1591 95/500 [====>.........................] - ETA: 2:17 - loss: 0.9596 - regression_loss: 0.7998 - classification_loss: 0.1598 96/500 [====>.........................] - ETA: 2:17 - loss: 0.9580 - regression_loss: 0.7993 - classification_loss: 0.1586 97/500 [====>.........................] - ETA: 2:17 - loss: 0.9654 - regression_loss: 0.8067 - classification_loss: 0.1587 98/500 [====>.........................] - ETA: 2:16 - loss: 0.9637 - regression_loss: 0.8059 - classification_loss: 0.1578 99/500 [====>.........................] - ETA: 2:16 - loss: 0.9618 - regression_loss: 0.8052 - classification_loss: 0.1567 100/500 [=====>........................] 
- ETA: 2:16 - loss: 0.9593 - regression_loss: 0.8036 - classification_loss: 0.1557 101/500 [=====>........................] - ETA: 2:15 - loss: 0.9620 - regression_loss: 0.8064 - classification_loss: 0.1556 102/500 [=====>........................] - ETA: 2:15 - loss: 0.9615 - regression_loss: 0.8068 - classification_loss: 0.1547 103/500 [=====>........................] - ETA: 2:15 - loss: 0.9596 - regression_loss: 0.8060 - classification_loss: 0.1536 104/500 [=====>........................] - ETA: 2:14 - loss: 0.9564 - regression_loss: 0.8038 - classification_loss: 0.1526 105/500 [=====>........................] - ETA: 2:14 - loss: 0.9548 - regression_loss: 0.8031 - classification_loss: 0.1517 106/500 [=====>........................] - ETA: 2:14 - loss: 0.9546 - regression_loss: 0.8033 - classification_loss: 0.1513 107/500 [=====>........................] - ETA: 2:13 - loss: 0.9533 - regression_loss: 0.8029 - classification_loss: 0.1504 108/500 [=====>........................] - ETA: 2:13 - loss: 0.9510 - regression_loss: 0.8013 - classification_loss: 0.1497 109/500 [=====>........................] - ETA: 2:13 - loss: 0.9478 - regression_loss: 0.7990 - classification_loss: 0.1488 110/500 [=====>........................] - ETA: 2:12 - loss: 0.9466 - regression_loss: 0.7988 - classification_loss: 0.1479 111/500 [=====>........................] - ETA: 2:12 - loss: 0.9423 - regression_loss: 0.7955 - classification_loss: 0.1468 112/500 [=====>........................] - ETA: 2:11 - loss: 0.9385 - regression_loss: 0.7921 - classification_loss: 0.1464 113/500 [=====>........................] - ETA: 2:11 - loss: 0.9362 - regression_loss: 0.7908 - classification_loss: 0.1455 114/500 [=====>........................] - ETA: 2:11 - loss: 0.9333 - regression_loss: 0.7886 - classification_loss: 0.1447 115/500 [=====>........................] - ETA: 2:10 - loss: 0.9308 - regression_loss: 0.7869 - classification_loss: 0.1438 116/500 [=====>........................] 
- ETA: 2:10 - loss: 0.9345 - regression_loss: 0.7909 - classification_loss: 0.1436 117/500 [======>.......................] - ETA: 2:10 - loss: 0.9340 - regression_loss: 0.7910 - classification_loss: 0.1430 118/500 [======>.......................] - ETA: 2:09 - loss: 0.9344 - regression_loss: 0.7910 - classification_loss: 0.1434 119/500 [======>.......................] - ETA: 2:09 - loss: 0.9267 - regression_loss: 0.7843 - classification_loss: 0.1423 120/500 [======>.......................] - ETA: 2:09 - loss: 0.9244 - regression_loss: 0.7829 - classification_loss: 0.1415 121/500 [======>.......................] - ETA: 2:08 - loss: 0.9219 - regression_loss: 0.7811 - classification_loss: 0.1408 122/500 [======>.......................] - ETA: 2:08 - loss: 0.9288 - regression_loss: 0.7871 - classification_loss: 0.1417 123/500 [======>.......................] - ETA: 2:08 - loss: 0.9269 - regression_loss: 0.7855 - classification_loss: 0.1415 124/500 [======>.......................] - ETA: 2:07 - loss: 0.9252 - regression_loss: 0.7842 - classification_loss: 0.1410 125/500 [======>.......................] - ETA: 2:07 - loss: 0.9267 - regression_loss: 0.7857 - classification_loss: 0.1409 126/500 [======>.......................] - ETA: 2:07 - loss: 0.9292 - regression_loss: 0.7882 - classification_loss: 0.1410 127/500 [======>.......................] - ETA: 2:06 - loss: 0.9260 - regression_loss: 0.7858 - classification_loss: 0.1403 128/500 [======>.......................] - ETA: 2:06 - loss: 0.9271 - regression_loss: 0.7873 - classification_loss: 0.1398 129/500 [======>.......................] - ETA: 2:05 - loss: 0.9288 - regression_loss: 0.7893 - classification_loss: 0.1395 130/500 [======>.......................] - ETA: 2:05 - loss: 0.9263 - regression_loss: 0.7877 - classification_loss: 0.1387 131/500 [======>.......................] - ETA: 2:05 - loss: 0.9255 - regression_loss: 0.7874 - classification_loss: 0.1381 132/500 [======>.......................] 
- ETA: 2:05 - loss: 0.9207 - regression_loss: 0.7835 - classification_loss: 0.1371 133/500 [======>.......................] - ETA: 2:04 - loss: 0.9178 - regression_loss: 0.7814 - classification_loss: 0.1364 134/500 [=======>......................] - ETA: 2:04 - loss: 0.9169 - regression_loss: 0.7812 - classification_loss: 0.1357 135/500 [=======>......................] - ETA: 2:04 - loss: 0.9195 - regression_loss: 0.7836 - classification_loss: 0.1359 136/500 [=======>......................] - ETA: 2:03 - loss: 0.9199 - regression_loss: 0.7841 - classification_loss: 0.1358 137/500 [=======>......................] - ETA: 2:03 - loss: 0.9203 - regression_loss: 0.7850 - classification_loss: 0.1353 138/500 [=======>......................] - ETA: 2:03 - loss: 0.9184 - regression_loss: 0.7837 - classification_loss: 0.1347 139/500 [=======>......................] - ETA: 2:02 - loss: 0.9221 - regression_loss: 0.7869 - classification_loss: 0.1353 140/500 [=======>......................] - ETA: 2:02 - loss: 0.9217 - regression_loss: 0.7870 - classification_loss: 0.1348 141/500 [=======>......................] - ETA: 2:01 - loss: 0.9214 - regression_loss: 0.7872 - classification_loss: 0.1342 142/500 [=======>......................] - ETA: 2:01 - loss: 0.9212 - regression_loss: 0.7873 - classification_loss: 0.1339 143/500 [=======>......................] - ETA: 2:01 - loss: 0.9207 - regression_loss: 0.7872 - classification_loss: 0.1335 144/500 [=======>......................] - ETA: 2:01 - loss: 0.9219 - regression_loss: 0.7879 - classification_loss: 0.1340 145/500 [=======>......................] - ETA: 2:00 - loss: 0.9206 - regression_loss: 0.7872 - classification_loss: 0.1334 146/500 [=======>......................] - ETA: 2:00 - loss: 0.9206 - regression_loss: 0.7875 - classification_loss: 0.1331 147/500 [=======>......................] - ETA: 2:00 - loss: 0.9227 - regression_loss: 0.7901 - classification_loss: 0.1326 148/500 [=======>......................] 
- ETA: 1:59 - loss: 0.9202 - regression_loss: 0.7881 - classification_loss: 0.1320 149/500 [=======>......................] - ETA: 1:59 - loss: 0.9189 - regression_loss: 0.7872 - classification_loss: 0.1317 150/500 [========>.....................] - ETA: 1:59 - loss: 0.9178 - regression_loss: 0.7867 - classification_loss: 0.1311 151/500 [========>.....................] - ETA: 1:58 - loss: 0.9182 - regression_loss: 0.7873 - classification_loss: 0.1309 152/500 [========>.....................] - ETA: 1:58 - loss: 0.9143 - regression_loss: 0.7841 - classification_loss: 0.1302 153/500 [========>.....................] - ETA: 1:58 - loss: 0.9125 - regression_loss: 0.7830 - classification_loss: 0.1295 154/500 [========>.....................] - ETA: 1:57 - loss: 0.9143 - regression_loss: 0.7844 - classification_loss: 0.1299 155/500 [========>.....................] - ETA: 1:57 - loss: 0.9132 - regression_loss: 0.7839 - classification_loss: 0.1293 156/500 [========>.....................] - ETA: 1:56 - loss: 0.9130 - regression_loss: 0.7840 - classification_loss: 0.1291 157/500 [========>.....................] - ETA: 1:56 - loss: 0.9112 - regression_loss: 0.7826 - classification_loss: 0.1286 158/500 [========>.....................] - ETA: 1:56 - loss: 0.9123 - regression_loss: 0.7837 - classification_loss: 0.1286 159/500 [========>.....................] - ETA: 1:55 - loss: 0.9110 - regression_loss: 0.7828 - classification_loss: 0.1282 160/500 [========>.....................] - ETA: 1:55 - loss: 0.9110 - regression_loss: 0.7832 - classification_loss: 0.1277 161/500 [========>.....................] - ETA: 1:55 - loss: 0.9113 - regression_loss: 0.7834 - classification_loss: 0.1280 162/500 [========>.....................] - ETA: 1:54 - loss: 0.9119 - regression_loss: 0.7844 - classification_loss: 0.1275 163/500 [========>.....................] - ETA: 1:54 - loss: 0.9110 - regression_loss: 0.7841 - classification_loss: 0.1270 164/500 [========>.....................] 
- ETA: 1:54 - loss: 0.9098 - regression_loss: 0.7833 - classification_loss: 0.1266 165/500 [========>.....................] - ETA: 1:53 - loss: 0.9100 - regression_loss: 0.7836 - classification_loss: 0.1263 166/500 [========>.....................] - ETA: 1:53 - loss: 0.9088 - regression_loss: 0.7830 - classification_loss: 0.1257 167/500 [=========>....................] - ETA: 1:53 - loss: 0.9083 - regression_loss: 0.7830 - classification_loss: 0.1253 168/500 [=========>....................] - ETA: 1:52 - loss: 0.9134 - regression_loss: 0.7869 - classification_loss: 0.1264 169/500 [=========>....................] - ETA: 1:52 - loss: 0.9129 - regression_loss: 0.7868 - classification_loss: 0.1261 170/500 [=========>....................] - ETA: 1:52 - loss: 0.9106 - regression_loss: 0.7851 - classification_loss: 0.1255 171/500 [=========>....................] - ETA: 1:51 - loss: 0.9100 - regression_loss: 0.7849 - classification_loss: 0.1251 172/500 [=========>....................] - ETA: 1:51 - loss: 0.9114 - regression_loss: 0.7863 - classification_loss: 0.1251 173/500 [=========>....................] - ETA: 1:51 - loss: 0.9115 - regression_loss: 0.7862 - classification_loss: 0.1253 174/500 [=========>....................] - ETA: 1:50 - loss: 0.9105 - regression_loss: 0.7856 - classification_loss: 0.1249 175/500 [=========>....................] - ETA: 1:50 - loss: 0.9091 - regression_loss: 0.7846 - classification_loss: 0.1245 176/500 [=========>....................] - ETA: 1:50 - loss: 0.9079 - regression_loss: 0.7838 - classification_loss: 0.1241 177/500 [=========>....................] - ETA: 1:49 - loss: 0.9097 - regression_loss: 0.7855 - classification_loss: 0.1242 178/500 [=========>....................] - ETA: 1:49 - loss: 0.9084 - regression_loss: 0.7847 - classification_loss: 0.1237 179/500 [=========>....................] - ETA: 1:49 - loss: 0.9059 - regression_loss: 0.7828 - classification_loss: 0.1231 180/500 [=========>....................] 
- ETA: 1:48 - loss: 0.9042 - regression_loss: 0.7813 - classification_loss: 0.1228 181/500 [=========>....................] - ETA: 1:48 - loss: 0.9005 - regression_loss: 0.7783 - classification_loss: 0.1222 182/500 [=========>....................] - ETA: 1:48 - loss: 0.8981 - regression_loss: 0.7758 - classification_loss: 0.1222 183/500 [=========>....................] - ETA: 1:47 - loss: 0.8958 - regression_loss: 0.7740 - classification_loss: 0.1217 184/500 [==========>...................] - ETA: 1:47 - loss: 0.8959 - regression_loss: 0.7739 - classification_loss: 0.1220 185/500 [==========>...................] - ETA: 1:47 - loss: 0.8985 - regression_loss: 0.7759 - classification_loss: 0.1226 186/500 [==========>...................] - ETA: 1:46 - loss: 0.8978 - regression_loss: 0.7755 - classification_loss: 0.1222 187/500 [==========>...................] - ETA: 1:46 - loss: 0.8980 - regression_loss: 0.7757 - classification_loss: 0.1222 188/500 [==========>...................] - ETA: 1:46 - loss: 0.8958 - regression_loss: 0.7738 - classification_loss: 0.1220 189/500 [==========>...................] - ETA: 1:45 - loss: 0.8955 - regression_loss: 0.7739 - classification_loss: 0.1216 190/500 [==========>...................] - ETA: 1:45 - loss: 0.8954 - regression_loss: 0.7741 - classification_loss: 0.1213 191/500 [==========>...................] - ETA: 1:45 - loss: 0.8919 - regression_loss: 0.7711 - classification_loss: 0.1208 192/500 [==========>...................] - ETA: 1:44 - loss: 0.8962 - regression_loss: 0.7748 - classification_loss: 0.1214 193/500 [==========>...................] - ETA: 1:44 - loss: 0.8979 - regression_loss: 0.7763 - classification_loss: 0.1217 194/500 [==========>...................] - ETA: 1:44 - loss: 0.8986 - regression_loss: 0.7769 - classification_loss: 0.1218 195/500 [==========>...................] - ETA: 1:43 - loss: 0.9003 - regression_loss: 0.7786 - classification_loss: 0.1217 196/500 [==========>...................] 
- ETA: 1:43 - loss: 0.8999 - regression_loss: 0.7784 - classification_loss: 0.1216 197/500 [==========>...................] - ETA: 1:43 - loss: 0.9001 - regression_loss: 0.7788 - classification_loss: 0.1214 198/500 [==========>...................] - ETA: 1:42 - loss: 0.9012 - regression_loss: 0.7792 - classification_loss: 0.1221 199/500 [==========>...................] - ETA: 1:42 - loss: 0.9031 - regression_loss: 0.7807 - classification_loss: 0.1224 200/500 [===========>..................] - ETA: 1:42 - loss: 0.9023 - regression_loss: 0.7802 - classification_loss: 0.1222 201/500 [===========>..................] - ETA: 1:41 - loss: 0.9005 - regression_loss: 0.7787 - classification_loss: 0.1218 202/500 [===========>..................] - ETA: 1:41 - loss: 0.9014 - regression_loss: 0.7796 - classification_loss: 0.1218 203/500 [===========>..................] - ETA: 1:41 - loss: 0.9012 - regression_loss: 0.7797 - classification_loss: 0.1215 204/500 [===========>..................] - ETA: 1:40 - loss: 0.8984 - regression_loss: 0.7773 - classification_loss: 0.1211 205/500 [===========>..................] - ETA: 1:40 - loss: 0.8970 - regression_loss: 0.7762 - classification_loss: 0.1208 206/500 [===========>..................] - ETA: 1:39 - loss: 0.8970 - regression_loss: 0.7764 - classification_loss: 0.1205 207/500 [===========>..................] - ETA: 1:39 - loss: 0.8976 - regression_loss: 0.7774 - classification_loss: 0.1202 208/500 [===========>..................] - ETA: 1:39 - loss: 0.8979 - regression_loss: 0.7778 - classification_loss: 0.1201 209/500 [===========>..................] - ETA: 1:38 - loss: 0.8996 - regression_loss: 0.7789 - classification_loss: 0.1207 210/500 [===========>..................] - ETA: 1:38 - loss: 0.8994 - regression_loss: 0.7789 - classification_loss: 0.1205 211/500 [===========>..................] - ETA: 1:38 - loss: 0.8988 - regression_loss: 0.7786 - classification_loss: 0.1202 212/500 [===========>..................] 
- ETA: 1:37 - loss: 0.9004 - regression_loss: 0.7805 - classification_loss: 0.1199 213/500 [===========>..................] - ETA: 1:37 - loss: 0.9021 - regression_loss: 0.7824 - classification_loss: 0.1197 214/500 [===========>..................] - ETA: 1:37 - loss: 0.8997 - regression_loss: 0.7805 - classification_loss: 0.1192 215/500 [===========>..................] - ETA: 1:36 - loss: 0.8981 - regression_loss: 0.7792 - classification_loss: 0.1189 216/500 [===========>..................] - ETA: 1:36 - loss: 0.8956 - regression_loss: 0.7772 - classification_loss: 0.1184 217/500 [============>.................] - ETA: 1:36 - loss: 0.8961 - regression_loss: 0.7779 - classification_loss: 0.1181 218/500 [============>.................] - ETA: 1:35 - loss: 0.8961 - regression_loss: 0.7781 - classification_loss: 0.1180 219/500 [============>.................] - ETA: 1:35 - loss: 0.8935 - regression_loss: 0.7760 - classification_loss: 0.1175 220/500 [============>.................] - ETA: 1:35 - loss: 0.8929 - regression_loss: 0.7752 - classification_loss: 0.1177 221/500 [============>.................] - ETA: 1:34 - loss: 0.8901 - regression_loss: 0.7729 - classification_loss: 0.1172 222/500 [============>.................] - ETA: 1:34 - loss: 0.8891 - regression_loss: 0.7723 - classification_loss: 0.1168 223/500 [============>.................] - ETA: 1:34 - loss: 0.8877 - regression_loss: 0.7712 - classification_loss: 0.1165 224/500 [============>.................] - ETA: 1:33 - loss: 0.8886 - regression_loss: 0.7720 - classification_loss: 0.1165 225/500 [============>.................] - ETA: 1:33 - loss: 0.8906 - regression_loss: 0.7741 - classification_loss: 0.1165 226/500 [============>.................] - ETA: 1:33 - loss: 0.8901 - regression_loss: 0.7737 - classification_loss: 0.1164 227/500 [============>.................] - ETA: 1:32 - loss: 0.8888 - regression_loss: 0.7727 - classification_loss: 0.1161 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.8870 - regression_loss: 0.7713 - classification_loss: 0.1157 229/500 [============>.................] - ETA: 1:32 - loss: 0.8858 - regression_loss: 0.7704 - classification_loss: 0.1154 230/500 [============>.................] - ETA: 1:31 - loss: 0.8841 - regression_loss: 0.7692 - classification_loss: 0.1149 231/500 [============>.................] - ETA: 1:31 - loss: 0.8843 - regression_loss: 0.7693 - classification_loss: 0.1150 232/500 [============>.................] - ETA: 1:31 - loss: 0.8838 - regression_loss: 0.7691 - classification_loss: 0.1147 233/500 [============>.................] - ETA: 1:30 - loss: 0.8838 - regression_loss: 0.7692 - classification_loss: 0.1146 234/500 [=============>................] - ETA: 1:30 - loss: 0.8822 - regression_loss: 0.7679 - classification_loss: 0.1143 235/500 [=============>................] - ETA: 1:29 - loss: 0.8815 - regression_loss: 0.7675 - classification_loss: 0.1140 236/500 [=============>................] - ETA: 1:29 - loss: 0.8814 - regression_loss: 0.7675 - classification_loss: 0.1140 237/500 [=============>................] - ETA: 1:29 - loss: 0.8809 - regression_loss: 0.7671 - classification_loss: 0.1137 238/500 [=============>................] - ETA: 1:28 - loss: 0.8811 - regression_loss: 0.7674 - classification_loss: 0.1137 239/500 [=============>................] - ETA: 1:28 - loss: 0.8820 - regression_loss: 0.7683 - classification_loss: 0.1137 240/500 [=============>................] - ETA: 1:28 - loss: 0.8830 - regression_loss: 0.7691 - classification_loss: 0.1139 241/500 [=============>................] - ETA: 1:27 - loss: 0.8807 - regression_loss: 0.7672 - classification_loss: 0.1135 242/500 [=============>................] - ETA: 1:27 - loss: 0.8802 - regression_loss: 0.7670 - classification_loss: 0.1132 243/500 [=============>................] - ETA: 1:27 - loss: 0.8779 - regression_loss: 0.7651 - classification_loss: 0.1128 244/500 [=============>................] 
- ETA: 1:26 - loss: 0.8774 - regression_loss: 0.7648 - classification_loss: 0.1125 245/500 [=============>................] - ETA: 1:26 - loss: 0.8779 - regression_loss: 0.7654 - classification_loss: 0.1125 246/500 [=============>................] - ETA: 1:26 - loss: 0.8764 - regression_loss: 0.7641 - classification_loss: 0.1123 247/500 [=============>................] - ETA: 1:25 - loss: 0.8774 - regression_loss: 0.7650 - classification_loss: 0.1124 248/500 [=============>................] - ETA: 1:25 - loss: 0.8760 - regression_loss: 0.7639 - classification_loss: 0.1121 249/500 [=============>................] - ETA: 1:25 - loss: 0.8753 - regression_loss: 0.7634 - classification_loss: 0.1119 250/500 [==============>...............] - ETA: 1:24 - loss: 0.8743 - regression_loss: 0.7627 - classification_loss: 0.1116 251/500 [==============>...............] - ETA: 1:24 - loss: 0.8740 - regression_loss: 0.7626 - classification_loss: 0.1114 252/500 [==============>...............] - ETA: 1:24 - loss: 0.8734 - regression_loss: 0.7620 - classification_loss: 0.1113 253/500 [==============>...............] - ETA: 1:23 - loss: 0.8749 - regression_loss: 0.7635 - classification_loss: 0.1114 254/500 [==============>...............] - ETA: 1:23 - loss: 0.8749 - regression_loss: 0.7636 - classification_loss: 0.1113 255/500 [==============>...............] - ETA: 1:23 - loss: 0.8736 - regression_loss: 0.7625 - classification_loss: 0.1112 256/500 [==============>...............] - ETA: 1:22 - loss: 0.8735 - regression_loss: 0.7623 - classification_loss: 0.1112 257/500 [==============>...............] - ETA: 1:22 - loss: 0.8726 - regression_loss: 0.7616 - classification_loss: 0.1110 258/500 [==============>...............] - ETA: 1:22 - loss: 0.8733 - regression_loss: 0.7624 - classification_loss: 0.1109 259/500 [==============>...............] - ETA: 1:21 - loss: 0.8717 - regression_loss: 0.7610 - classification_loss: 0.1107 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.8710 - regression_loss: 0.7605 - classification_loss: 0.1104
[per-step progress frames for steps 261-498 elided: running loss declined steadily from 0.8704 to 0.8492 (regression_loss 0.7603 -> 0.7526, classification_loss 0.1102 -> 0.0967)]
499/500 [============================>.]
- ETA: 0s - loss: 0.8493 - regression_loss: 0.7526 - classification_loss: 0.0967
500/500 [==============================] - 170s 339ms/step - loss: 0.8500 - regression_loss: 0.7532 - classification_loss: 0.0968
326 instances of class plum with average precision: 0.8348
mAP: 0.8348
Epoch 00034: saving model to ./training/snapshots/resnet101_pascal_34.h5
Epoch 35/150
[per-step progress frames for steps 1-13 elided: running loss climbed from 0.5448 to 0.7223 (regression_loss 0.5235 -> 0.6491, classification_loss 0.0213 -> 0.0731) as the epoch average stabilized]
14/500 [..............................]
- ETA: 2:46 - loss: 0.7172 - regression_loss: 0.6474 - classification_loss: 0.0698
[per-step progress frames for steps 15-93 elided: running loss fluctuated upward from 0.7172 to 0.8909 (regression_loss 0.6474 -> 0.7997, classification_loss 0.0698 -> 0.0912) while the epoch average settled]
94/500 [====>.........................]
- ETA: 2:17 - loss: 0.8891 - regression_loss: 0.7980 - classification_loss: 0.0911 95/500 [====>.........................] - ETA: 2:17 - loss: 0.8855 - regression_loss: 0.7948 - classification_loss: 0.0907 96/500 [====>.........................] - ETA: 2:17 - loss: 0.8816 - regression_loss: 0.7911 - classification_loss: 0.0906 97/500 [====>.........................] - ETA: 2:16 - loss: 0.8800 - regression_loss: 0.7901 - classification_loss: 0.0899 98/500 [====>.........................] - ETA: 2:16 - loss: 0.8803 - regression_loss: 0.7907 - classification_loss: 0.0897 99/500 [====>.........................] - ETA: 2:15 - loss: 0.8807 - regression_loss: 0.7912 - classification_loss: 0.0896 100/500 [=====>........................] - ETA: 2:15 - loss: 0.8801 - regression_loss: 0.7904 - classification_loss: 0.0898 101/500 [=====>........................] - ETA: 2:15 - loss: 0.8767 - regression_loss: 0.7872 - classification_loss: 0.0894 102/500 [=====>........................] - ETA: 2:14 - loss: 0.8728 - regression_loss: 0.7839 - classification_loss: 0.0890 103/500 [=====>........................] - ETA: 2:14 - loss: 0.8687 - regression_loss: 0.7804 - classification_loss: 0.0883 104/500 [=====>........................] - ETA: 2:14 - loss: 0.8648 - regression_loss: 0.7771 - classification_loss: 0.0877 105/500 [=====>........................] - ETA: 2:13 - loss: 0.8684 - regression_loss: 0.7797 - classification_loss: 0.0888 106/500 [=====>........................] - ETA: 2:13 - loss: 0.8676 - regression_loss: 0.7789 - classification_loss: 0.0887 107/500 [=====>........................] - ETA: 2:13 - loss: 0.8688 - regression_loss: 0.7801 - classification_loss: 0.0887 108/500 [=====>........................] - ETA: 2:12 - loss: 0.8692 - regression_loss: 0.7809 - classification_loss: 0.0883 109/500 [=====>........................] - ETA: 2:12 - loss: 0.8695 - regression_loss: 0.7797 - classification_loss: 0.0898 110/500 [=====>........................] 
- ETA: 2:11 - loss: 0.8690 - regression_loss: 0.7794 - classification_loss: 0.0896 111/500 [=====>........................] - ETA: 2:11 - loss: 0.8643 - regression_loss: 0.7753 - classification_loss: 0.0890 112/500 [=====>........................] - ETA: 2:11 - loss: 0.8626 - regression_loss: 0.7743 - classification_loss: 0.0884 113/500 [=====>........................] - ETA: 2:10 - loss: 0.8590 - regression_loss: 0.7712 - classification_loss: 0.0878 114/500 [=====>........................] - ETA: 2:10 - loss: 0.8583 - regression_loss: 0.7711 - classification_loss: 0.0872 115/500 [=====>........................] - ETA: 2:10 - loss: 0.8603 - regression_loss: 0.7730 - classification_loss: 0.0873 116/500 [=====>........................] - ETA: 2:09 - loss: 0.8605 - regression_loss: 0.7733 - classification_loss: 0.0871 117/500 [======>.......................] - ETA: 2:09 - loss: 0.8582 - regression_loss: 0.7709 - classification_loss: 0.0873 118/500 [======>.......................] - ETA: 2:09 - loss: 0.8596 - regression_loss: 0.7718 - classification_loss: 0.0878 119/500 [======>.......................] - ETA: 2:08 - loss: 0.8602 - regression_loss: 0.7725 - classification_loss: 0.0877 120/500 [======>.......................] - ETA: 2:08 - loss: 0.8546 - regression_loss: 0.7675 - classification_loss: 0.0872 121/500 [======>.......................] - ETA: 2:08 - loss: 0.8550 - regression_loss: 0.7679 - classification_loss: 0.0871 122/500 [======>.......................] - ETA: 2:07 - loss: 0.8505 - regression_loss: 0.7640 - classification_loss: 0.0865 123/500 [======>.......................] - ETA: 2:07 - loss: 0.8504 - regression_loss: 0.7638 - classification_loss: 0.0866 124/500 [======>.......................] - ETA: 2:07 - loss: 0.8518 - regression_loss: 0.7648 - classification_loss: 0.0870 125/500 [======>.......................] - ETA: 2:06 - loss: 0.8471 - regression_loss: 0.7607 - classification_loss: 0.0865 126/500 [======>.......................] 
- ETA: 2:06 - loss: 0.8502 - regression_loss: 0.7641 - classification_loss: 0.0860 127/500 [======>.......................] - ETA: 2:06 - loss: 0.8511 - regression_loss: 0.7651 - classification_loss: 0.0860 128/500 [======>.......................] - ETA: 2:05 - loss: 0.8494 - regression_loss: 0.7638 - classification_loss: 0.0856 129/500 [======>.......................] - ETA: 2:05 - loss: 0.8465 - regression_loss: 0.7613 - classification_loss: 0.0852 130/500 [======>.......................] - ETA: 2:05 - loss: 0.8485 - regression_loss: 0.7635 - classification_loss: 0.0850 131/500 [======>.......................] - ETA: 2:04 - loss: 0.8482 - regression_loss: 0.7633 - classification_loss: 0.0848 132/500 [======>.......................] - ETA: 2:04 - loss: 0.8471 - regression_loss: 0.7625 - classification_loss: 0.0846 133/500 [======>.......................] - ETA: 2:04 - loss: 0.8511 - regression_loss: 0.7656 - classification_loss: 0.0855 134/500 [=======>......................] - ETA: 2:03 - loss: 0.8506 - regression_loss: 0.7653 - classification_loss: 0.0854 135/500 [=======>......................] - ETA: 2:03 - loss: 0.8464 - regression_loss: 0.7616 - classification_loss: 0.0848 136/500 [=======>......................] - ETA: 2:03 - loss: 0.8443 - regression_loss: 0.7599 - classification_loss: 0.0845 137/500 [=======>......................] - ETA: 2:02 - loss: 0.8432 - regression_loss: 0.7590 - classification_loss: 0.0843 138/500 [=======>......................] - ETA: 2:02 - loss: 0.8433 - regression_loss: 0.7590 - classification_loss: 0.0843 139/500 [=======>......................] - ETA: 2:02 - loss: 0.8458 - regression_loss: 0.7609 - classification_loss: 0.0849 140/500 [=======>......................] - ETA: 2:01 - loss: 0.8484 - regression_loss: 0.7634 - classification_loss: 0.0850 141/500 [=======>......................] - ETA: 2:01 - loss: 0.8464 - regression_loss: 0.7616 - classification_loss: 0.0847 142/500 [=======>......................] 
- ETA: 2:01 - loss: 0.8483 - regression_loss: 0.7635 - classification_loss: 0.0847 143/500 [=======>......................] - ETA: 2:00 - loss: 0.8459 - regression_loss: 0.7614 - classification_loss: 0.0845 144/500 [=======>......................] - ETA: 2:00 - loss: 0.8472 - regression_loss: 0.7626 - classification_loss: 0.0845 145/500 [=======>......................] - ETA: 2:00 - loss: 0.8522 - regression_loss: 0.7670 - classification_loss: 0.0852 146/500 [=======>......................] - ETA: 1:59 - loss: 0.8504 - regression_loss: 0.7655 - classification_loss: 0.0849 147/500 [=======>......................] - ETA: 1:59 - loss: 0.8499 - regression_loss: 0.7654 - classification_loss: 0.0845 148/500 [=======>......................] - ETA: 1:59 - loss: 0.8561 - regression_loss: 0.7714 - classification_loss: 0.0847 149/500 [=======>......................] - ETA: 1:58 - loss: 0.8575 - regression_loss: 0.7728 - classification_loss: 0.0847 150/500 [========>.....................] - ETA: 1:58 - loss: 0.8604 - regression_loss: 0.7752 - classification_loss: 0.0852 151/500 [========>.....................] - ETA: 1:58 - loss: 0.8598 - regression_loss: 0.7747 - classification_loss: 0.0852 152/500 [========>.....................] - ETA: 1:57 - loss: 0.8591 - regression_loss: 0.7742 - classification_loss: 0.0849 153/500 [========>.....................] - ETA: 1:57 - loss: 0.8590 - regression_loss: 0.7743 - classification_loss: 0.0847 154/500 [========>.....................] - ETA: 1:57 - loss: 0.8557 - regression_loss: 0.7714 - classification_loss: 0.0843 155/500 [========>.....................] - ETA: 1:56 - loss: 0.8543 - regression_loss: 0.7704 - classification_loss: 0.0839 156/500 [========>.....................] - ETA: 1:56 - loss: 0.8513 - regression_loss: 0.7677 - classification_loss: 0.0836 157/500 [========>.....................] - ETA: 1:56 - loss: 0.8476 - regression_loss: 0.7645 - classification_loss: 0.0831 158/500 [========>.....................] 
- ETA: 1:55 - loss: 0.8461 - regression_loss: 0.7632 - classification_loss: 0.0829 159/500 [========>.....................] - ETA: 1:55 - loss: 0.8452 - regression_loss: 0.7627 - classification_loss: 0.0825 160/500 [========>.....................] - ETA: 1:55 - loss: 0.8488 - regression_loss: 0.7656 - classification_loss: 0.0831 161/500 [========>.....................] - ETA: 1:54 - loss: 0.8482 - regression_loss: 0.7653 - classification_loss: 0.0829 162/500 [========>.....................] - ETA: 1:54 - loss: 0.8489 - regression_loss: 0.7660 - classification_loss: 0.0829 163/500 [========>.....................] - ETA: 1:54 - loss: 0.8494 - regression_loss: 0.7667 - classification_loss: 0.0828 164/500 [========>.....................] - ETA: 1:53 - loss: 0.8475 - regression_loss: 0.7650 - classification_loss: 0.0825 165/500 [========>.....................] - ETA: 1:53 - loss: 0.8474 - regression_loss: 0.7651 - classification_loss: 0.0823 166/500 [========>.....................] - ETA: 1:53 - loss: 0.8469 - regression_loss: 0.7646 - classification_loss: 0.0824 167/500 [=========>....................] - ETA: 1:52 - loss: 0.8438 - regression_loss: 0.7618 - classification_loss: 0.0820 168/500 [=========>....................] - ETA: 1:52 - loss: 0.8443 - regression_loss: 0.7622 - classification_loss: 0.0820 169/500 [=========>....................] - ETA: 1:52 - loss: 0.8438 - regression_loss: 0.7617 - classification_loss: 0.0821 170/500 [=========>....................] - ETA: 1:51 - loss: 0.8456 - regression_loss: 0.7636 - classification_loss: 0.0820 171/500 [=========>....................] - ETA: 1:51 - loss: 0.8440 - regression_loss: 0.7621 - classification_loss: 0.0819 172/500 [=========>....................] - ETA: 1:51 - loss: 0.8443 - regression_loss: 0.7623 - classification_loss: 0.0820 173/500 [=========>....................] - ETA: 1:50 - loss: 0.8436 - regression_loss: 0.7620 - classification_loss: 0.0816 174/500 [=========>....................] 
- ETA: 1:50 - loss: 0.8441 - regression_loss: 0.7625 - classification_loss: 0.0815 175/500 [=========>....................] - ETA: 1:50 - loss: 0.8467 - regression_loss: 0.7648 - classification_loss: 0.0819 176/500 [=========>....................] - ETA: 1:49 - loss: 0.8484 - regression_loss: 0.7665 - classification_loss: 0.0819 177/500 [=========>....................] - ETA: 1:49 - loss: 0.8457 - regression_loss: 0.7641 - classification_loss: 0.0815 178/500 [=========>....................] - ETA: 1:49 - loss: 0.8436 - regression_loss: 0.7623 - classification_loss: 0.0812 179/500 [=========>....................] - ETA: 1:48 - loss: 0.8418 - regression_loss: 0.7608 - classification_loss: 0.0810 180/500 [=========>....................] - ETA: 1:48 - loss: 0.8406 - regression_loss: 0.7598 - classification_loss: 0.0808 181/500 [=========>....................] - ETA: 1:48 - loss: 0.8420 - regression_loss: 0.7601 - classification_loss: 0.0819 182/500 [=========>....................] - ETA: 1:47 - loss: 0.8404 - regression_loss: 0.7588 - classification_loss: 0.0816 183/500 [=========>....................] - ETA: 1:47 - loss: 0.8406 - regression_loss: 0.7591 - classification_loss: 0.0815 184/500 [==========>...................] - ETA: 1:47 - loss: 0.8402 - regression_loss: 0.7588 - classification_loss: 0.0814 185/500 [==========>...................] - ETA: 1:46 - loss: 0.8392 - regression_loss: 0.7580 - classification_loss: 0.0812 186/500 [==========>...................] - ETA: 1:46 - loss: 0.8408 - regression_loss: 0.7597 - classification_loss: 0.0811 187/500 [==========>...................] - ETA: 1:46 - loss: 0.8388 - regression_loss: 0.7579 - classification_loss: 0.0810 188/500 [==========>...................] - ETA: 1:45 - loss: 0.8392 - regression_loss: 0.7582 - classification_loss: 0.0809 189/500 [==========>...................] - ETA: 1:45 - loss: 0.8393 - regression_loss: 0.7581 - classification_loss: 0.0812 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.8387 - regression_loss: 0.7575 - classification_loss: 0.0812 191/500 [==========>...................] - ETA: 1:44 - loss: 0.8404 - regression_loss: 0.7591 - classification_loss: 0.0813 192/500 [==========>...................] - ETA: 1:44 - loss: 0.8468 - regression_loss: 0.7645 - classification_loss: 0.0823 193/500 [==========>...................] - ETA: 1:44 - loss: 0.8473 - regression_loss: 0.7645 - classification_loss: 0.0828 194/500 [==========>...................] - ETA: 1:43 - loss: 0.8464 - regression_loss: 0.7635 - classification_loss: 0.0829 195/500 [==========>...................] - ETA: 1:43 - loss: 0.8443 - regression_loss: 0.7613 - classification_loss: 0.0830 196/500 [==========>...................] - ETA: 1:43 - loss: 0.8467 - regression_loss: 0.7636 - classification_loss: 0.0831 197/500 [==========>...................] - ETA: 1:42 - loss: 0.8474 - regression_loss: 0.7642 - classification_loss: 0.0831 198/500 [==========>...................] - ETA: 1:42 - loss: 0.8452 - regression_loss: 0.7624 - classification_loss: 0.0828 199/500 [==========>...................] - ETA: 1:42 - loss: 0.8451 - regression_loss: 0.7624 - classification_loss: 0.0828 200/500 [===========>..................] - ETA: 1:41 - loss: 0.8453 - regression_loss: 0.7625 - classification_loss: 0.0828 201/500 [===========>..................] - ETA: 1:41 - loss: 0.8449 - regression_loss: 0.7622 - classification_loss: 0.0827 202/500 [===========>..................] - ETA: 1:41 - loss: 0.8449 - regression_loss: 0.7620 - classification_loss: 0.0829 203/500 [===========>..................] - ETA: 1:40 - loss: 0.8452 - regression_loss: 0.7623 - classification_loss: 0.0829 204/500 [===========>..................] - ETA: 1:40 - loss: 0.8431 - regression_loss: 0.7604 - classification_loss: 0.0826 205/500 [===========>..................] - ETA: 1:40 - loss: 0.8429 - regression_loss: 0.7603 - classification_loss: 0.0825 206/500 [===========>..................] 
- ETA: 1:39 - loss: 0.8428 - regression_loss: 0.7604 - classification_loss: 0.0824 207/500 [===========>..................] - ETA: 1:39 - loss: 0.8408 - regression_loss: 0.7586 - classification_loss: 0.0822 208/500 [===========>..................] - ETA: 1:39 - loss: 0.8395 - regression_loss: 0.7574 - classification_loss: 0.0821 209/500 [===========>..................] - ETA: 1:38 - loss: 0.8392 - regression_loss: 0.7572 - classification_loss: 0.0820 210/500 [===========>..................] - ETA: 1:38 - loss: 0.8405 - regression_loss: 0.7585 - classification_loss: 0.0819 211/500 [===========>..................] - ETA: 1:37 - loss: 0.8412 - regression_loss: 0.7594 - classification_loss: 0.0818 212/500 [===========>..................] - ETA: 1:37 - loss: 0.8395 - regression_loss: 0.7581 - classification_loss: 0.0814 213/500 [===========>..................] - ETA: 1:37 - loss: 0.8372 - regression_loss: 0.7561 - classification_loss: 0.0811 214/500 [===========>..................] - ETA: 1:36 - loss: 0.8371 - regression_loss: 0.7561 - classification_loss: 0.0809 215/500 [===========>..................] - ETA: 1:36 - loss: 0.8355 - regression_loss: 0.7548 - classification_loss: 0.0807 216/500 [===========>..................] - ETA: 1:36 - loss: 0.8361 - regression_loss: 0.7553 - classification_loss: 0.0808 217/500 [============>.................] - ETA: 1:35 - loss: 0.8363 - regression_loss: 0.7556 - classification_loss: 0.0807 218/500 [============>.................] - ETA: 1:35 - loss: 0.8380 - regression_loss: 0.7571 - classification_loss: 0.0809 219/500 [============>.................] - ETA: 1:35 - loss: 0.8372 - regression_loss: 0.7565 - classification_loss: 0.0807 220/500 [============>.................] - ETA: 1:34 - loss: 0.8343 - regression_loss: 0.7538 - classification_loss: 0.0804 221/500 [============>.................] - ETA: 1:34 - loss: 0.8333 - regression_loss: 0.7530 - classification_loss: 0.0803 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.8359 - regression_loss: 0.7553 - classification_loss: 0.0806 223/500 [============>.................] - ETA: 1:33 - loss: 0.8344 - regression_loss: 0.7538 - classification_loss: 0.0806 224/500 [============>.................] - ETA: 1:33 - loss: 0.8345 - regression_loss: 0.7540 - classification_loss: 0.0805 225/500 [============>.................] - ETA: 1:33 - loss: 0.8348 - regression_loss: 0.7542 - classification_loss: 0.0805 226/500 [============>.................] - ETA: 1:32 - loss: 0.8385 - regression_loss: 0.7575 - classification_loss: 0.0810 227/500 [============>.................] - ETA: 1:32 - loss: 0.8375 - regression_loss: 0.7566 - classification_loss: 0.0809 228/500 [============>.................] - ETA: 1:32 - loss: 0.8373 - regression_loss: 0.7566 - classification_loss: 0.0806 229/500 [============>.................] - ETA: 1:31 - loss: 0.8388 - regression_loss: 0.7575 - classification_loss: 0.0813 230/500 [============>.................] - ETA: 1:31 - loss: 0.8386 - regression_loss: 0.7574 - classification_loss: 0.0812 231/500 [============>.................] - ETA: 1:31 - loss: 0.8388 - regression_loss: 0.7579 - classification_loss: 0.0810 232/500 [============>.................] - ETA: 1:30 - loss: 0.8374 - regression_loss: 0.7567 - classification_loss: 0.0807 233/500 [============>.................] - ETA: 1:30 - loss: 0.8386 - regression_loss: 0.7580 - classification_loss: 0.0806 234/500 [=============>................] - ETA: 1:30 - loss: 0.8373 - regression_loss: 0.7570 - classification_loss: 0.0803 235/500 [=============>................] - ETA: 1:29 - loss: 0.8435 - regression_loss: 0.7616 - classification_loss: 0.0819 236/500 [=============>................] - ETA: 1:29 - loss: 0.8417 - regression_loss: 0.7600 - classification_loss: 0.0817 237/500 [=============>................] - ETA: 1:29 - loss: 0.8404 - regression_loss: 0.7588 - classification_loss: 0.0816 238/500 [=============>................] 
- ETA: 1:28 - loss: 0.8399 - regression_loss: 0.7586 - classification_loss: 0.0813 239/500 [=============>................] - ETA: 1:28 - loss: 0.8439 - regression_loss: 0.7620 - classification_loss: 0.0819 240/500 [=============>................] - ETA: 1:28 - loss: 0.8422 - regression_loss: 0.7606 - classification_loss: 0.0816 241/500 [=============>................] - ETA: 1:27 - loss: 0.8501 - regression_loss: 0.7673 - classification_loss: 0.0828 242/500 [=============>................] - ETA: 1:27 - loss: 0.8505 - regression_loss: 0.7678 - classification_loss: 0.0827 243/500 [=============>................] - ETA: 1:27 - loss: 0.8488 - regression_loss: 0.7663 - classification_loss: 0.0824 244/500 [=============>................] - ETA: 1:26 - loss: 0.8500 - regression_loss: 0.7671 - classification_loss: 0.0829 245/500 [=============>................] - ETA: 1:26 - loss: 0.8492 - regression_loss: 0.7664 - classification_loss: 0.0828 246/500 [=============>................] - ETA: 1:26 - loss: 0.8521 - regression_loss: 0.7683 - classification_loss: 0.0837 247/500 [=============>................] - ETA: 1:25 - loss: 0.8525 - regression_loss: 0.7686 - classification_loss: 0.0838 248/500 [=============>................] - ETA: 1:25 - loss: 0.8520 - regression_loss: 0.7682 - classification_loss: 0.0838 249/500 [=============>................] - ETA: 1:25 - loss: 0.8497 - regression_loss: 0.7661 - classification_loss: 0.0836 250/500 [==============>...............] - ETA: 1:24 - loss: 0.8495 - regression_loss: 0.7660 - classification_loss: 0.0835 251/500 [==============>...............] - ETA: 1:24 - loss: 0.8495 - regression_loss: 0.7661 - classification_loss: 0.0834 252/500 [==============>...............] - ETA: 1:24 - loss: 0.8470 - regression_loss: 0.7639 - classification_loss: 0.0831 253/500 [==============>...............] - ETA: 1:23 - loss: 0.8461 - regression_loss: 0.7631 - classification_loss: 0.0830 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.8441 - regression_loss: 0.7613 - classification_loss: 0.0827 255/500 [==============>...............] - ETA: 1:23 - loss: 0.8463 - regression_loss: 0.7631 - classification_loss: 0.0831 256/500 [==============>...............] - ETA: 1:22 - loss: 0.8450 - regression_loss: 0.7621 - classification_loss: 0.0830 257/500 [==============>...............] - ETA: 1:22 - loss: 0.8440 - regression_loss: 0.7612 - classification_loss: 0.0828 258/500 [==============>...............] - ETA: 1:22 - loss: 0.8422 - regression_loss: 0.7596 - classification_loss: 0.0825 259/500 [==============>...............] - ETA: 1:21 - loss: 0.8402 - regression_loss: 0.7579 - classification_loss: 0.0823 260/500 [==============>...............] - ETA: 1:21 - loss: 0.8415 - regression_loss: 0.7590 - classification_loss: 0.0825 261/500 [==============>...............] - ETA: 1:20 - loss: 0.8421 - regression_loss: 0.7596 - classification_loss: 0.0826 262/500 [==============>...............] - ETA: 1:20 - loss: 0.8418 - regression_loss: 0.7594 - classification_loss: 0.0825 263/500 [==============>...............] - ETA: 1:20 - loss: 0.8418 - regression_loss: 0.7594 - classification_loss: 0.0824 264/500 [==============>...............] - ETA: 1:20 - loss: 0.8407 - regression_loss: 0.7585 - classification_loss: 0.0822 265/500 [==============>...............] - ETA: 1:19 - loss: 0.8413 - regression_loss: 0.7582 - classification_loss: 0.0831 266/500 [==============>...............] - ETA: 1:19 - loss: 0.8422 - regression_loss: 0.7591 - classification_loss: 0.0831 267/500 [===============>..............] - ETA: 1:18 - loss: 0.8414 - regression_loss: 0.7584 - classification_loss: 0.0829 268/500 [===============>..............] - ETA: 1:18 - loss: 0.8422 - regression_loss: 0.7592 - classification_loss: 0.0829 269/500 [===============>..............] - ETA: 1:18 - loss: 0.8434 - regression_loss: 0.7603 - classification_loss: 0.0832 270/500 [===============>..............] 
- ETA: 1:17 - loss: 0.8431 - regression_loss: 0.7601 - classification_loss: 0.0830 271/500 [===============>..............] - ETA: 1:17 - loss: 0.8459 - regression_loss: 0.7625 - classification_loss: 0.0834 272/500 [===============>..............] - ETA: 1:17 - loss: 0.8453 - regression_loss: 0.7621 - classification_loss: 0.0832 273/500 [===============>..............] - ETA: 1:16 - loss: 0.8458 - regression_loss: 0.7625 - classification_loss: 0.0833 274/500 [===============>..............] - ETA: 1:16 - loss: 0.8454 - regression_loss: 0.7621 - classification_loss: 0.0833 275/500 [===============>..............] - ETA: 1:16 - loss: 0.8445 - regression_loss: 0.7613 - classification_loss: 0.0832 276/500 [===============>..............] - ETA: 1:15 - loss: 0.8466 - regression_loss: 0.7633 - classification_loss: 0.0833 277/500 [===============>..............] - ETA: 1:15 - loss: 0.8451 - regression_loss: 0.7620 - classification_loss: 0.0831 278/500 [===============>..............] - ETA: 1:15 - loss: 0.8474 - regression_loss: 0.7636 - classification_loss: 0.0838 279/500 [===============>..............] - ETA: 1:14 - loss: 0.8453 - regression_loss: 0.7617 - classification_loss: 0.0836 280/500 [===============>..............] - ETA: 1:14 - loss: 0.8466 - regression_loss: 0.7626 - classification_loss: 0.0839 281/500 [===============>..............] - ETA: 1:14 - loss: 0.8464 - regression_loss: 0.7625 - classification_loss: 0.0839 282/500 [===============>..............] - ETA: 1:13 - loss: 0.8487 - regression_loss: 0.7644 - classification_loss: 0.0843 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8477 - regression_loss: 0.7636 - classification_loss: 0.0841 284/500 [================>.............] - ETA: 1:13 - loss: 0.8481 - regression_loss: 0.7639 - classification_loss: 0.0841 285/500 [================>.............] - ETA: 1:12 - loss: 0.8478 - regression_loss: 0.7639 - classification_loss: 0.0839 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.8477 - regression_loss: 0.7639 - classification_loss: 0.0839 287/500 [================>.............] - ETA: 1:12 - loss: 0.8481 - regression_loss: 0.7641 - classification_loss: 0.0840 288/500 [================>.............] - ETA: 1:11 - loss: 0.8478 - regression_loss: 0.7639 - classification_loss: 0.0839 289/500 [================>.............] - ETA: 1:11 - loss: 0.8475 - regression_loss: 0.7634 - classification_loss: 0.0841 290/500 [================>.............] - ETA: 1:11 - loss: 0.8461 - regression_loss: 0.7622 - classification_loss: 0.0839 291/500 [================>.............] - ETA: 1:10 - loss: 0.8458 - regression_loss: 0.7619 - classification_loss: 0.0839 292/500 [================>.............] - ETA: 1:10 - loss: 0.8470 - regression_loss: 0.7629 - classification_loss: 0.0841 293/500 [================>.............] - ETA: 1:10 - loss: 0.8467 - regression_loss: 0.7627 - classification_loss: 0.0840 294/500 [================>.............] - ETA: 1:09 - loss: 0.8462 - regression_loss: 0.7622 - classification_loss: 0.0840 295/500 [================>.............] - ETA: 1:09 - loss: 0.8460 - regression_loss: 0.7620 - classification_loss: 0.0840 296/500 [================>.............] - ETA: 1:09 - loss: 0.8462 - regression_loss: 0.7620 - classification_loss: 0.0841 297/500 [================>.............] - ETA: 1:08 - loss: 0.8450 - regression_loss: 0.7611 - classification_loss: 0.0840 298/500 [================>.............] - ETA: 1:08 - loss: 0.8461 - regression_loss: 0.7621 - classification_loss: 0.0840 299/500 [================>.............] - ETA: 1:08 - loss: 0.8458 - regression_loss: 0.7619 - classification_loss: 0.0839 300/500 [=================>............] - ETA: 1:07 - loss: 0.8452 - regression_loss: 0.7614 - classification_loss: 0.0838 301/500 [=================>............] - ETA: 1:07 - loss: 0.8450 - regression_loss: 0.7613 - classification_loss: 0.0837 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.8465 - regression_loss: 0.7628 - classification_loss: 0.0838
[per-step progress redraws for steps 303-499 of epoch 35 truncated]
500/500 [==============================] - 170s 340ms/step - loss: 0.8722 - regression_loss: 0.7811 - classification_loss: 0.0912
326 instances of class plum with average precision: 0.8291
mAP: 0.8291
Epoch 00035: saving model to ./training/snapshots/resnet101_pascal_35.h5
Epoch 36/150
1/500 [..............................] - ETA: 2:39 - loss: 0.7421 - regression_loss: 0.6751 - classification_loss: 0.0669
[per-step progress redraws for steps 2-8 of epoch 36 truncated]
9/500 [..............................]
- ETA: 2:48 - loss: 0.6609 - regression_loss: 0.5874 - classification_loss: 0.0735
[per-step progress redraws for steps 10-136 of epoch 36 truncated]
137/500 [=======>......................]
- ETA: 2:03 - loss: 0.8229 - regression_loss: 0.7438 - classification_loss: 0.0791 138/500 [=======>......................] - ETA: 2:03 - loss: 0.8260 - regression_loss: 0.7471 - classification_loss: 0.0790 139/500 [=======>......................] - ETA: 2:02 - loss: 0.8285 - regression_loss: 0.7491 - classification_loss: 0.0794 140/500 [=======>......................] - ETA: 2:02 - loss: 0.8252 - regression_loss: 0.7463 - classification_loss: 0.0789 141/500 [=======>......................] - ETA: 2:02 - loss: 0.8232 - regression_loss: 0.7446 - classification_loss: 0.0786 142/500 [=======>......................] - ETA: 2:01 - loss: 0.8230 - regression_loss: 0.7447 - classification_loss: 0.0783 143/500 [=======>......................] - ETA: 2:01 - loss: 0.8228 - regression_loss: 0.7444 - classification_loss: 0.0783 144/500 [=======>......................] - ETA: 2:01 - loss: 0.8235 - regression_loss: 0.7450 - classification_loss: 0.0785 145/500 [=======>......................] - ETA: 2:00 - loss: 0.8204 - regression_loss: 0.7423 - classification_loss: 0.0781 146/500 [=======>......................] - ETA: 2:00 - loss: 0.8214 - regression_loss: 0.7432 - classification_loss: 0.0782 147/500 [=======>......................] - ETA: 2:00 - loss: 0.8193 - regression_loss: 0.7415 - classification_loss: 0.0779 148/500 [=======>......................] - ETA: 1:59 - loss: 0.8164 - regression_loss: 0.7389 - classification_loss: 0.0775 149/500 [=======>......................] - ETA: 1:59 - loss: 0.8123 - regression_loss: 0.7352 - classification_loss: 0.0771 150/500 [========>.....................] - ETA: 1:58 - loss: 0.8120 - regression_loss: 0.7352 - classification_loss: 0.0769 151/500 [========>.....................] - ETA: 1:58 - loss: 0.8125 - regression_loss: 0.7356 - classification_loss: 0.0769 152/500 [========>.....................] - ETA: 1:58 - loss: 0.8121 - regression_loss: 0.7352 - classification_loss: 0.0769 153/500 [========>.....................] 
- ETA: 1:57 - loss: 0.8165 - regression_loss: 0.7381 - classification_loss: 0.0784 154/500 [========>.....................] - ETA: 1:57 - loss: 0.8168 - regression_loss: 0.7385 - classification_loss: 0.0783 155/500 [========>.....................] - ETA: 1:57 - loss: 0.8163 - regression_loss: 0.7382 - classification_loss: 0.0781 156/500 [========>.....................] - ETA: 1:56 - loss: 0.8169 - regression_loss: 0.7390 - classification_loss: 0.0779 157/500 [========>.....................] - ETA: 1:56 - loss: 0.8219 - regression_loss: 0.7430 - classification_loss: 0.0789 158/500 [========>.....................] - ETA: 1:56 - loss: 0.8227 - regression_loss: 0.7441 - classification_loss: 0.0786 159/500 [========>.....................] - ETA: 1:55 - loss: 0.8207 - regression_loss: 0.7422 - classification_loss: 0.0785 160/500 [========>.....................] - ETA: 1:55 - loss: 0.8188 - regression_loss: 0.7407 - classification_loss: 0.0781 161/500 [========>.....................] - ETA: 1:55 - loss: 0.8186 - regression_loss: 0.7407 - classification_loss: 0.0779 162/500 [========>.....................] - ETA: 1:54 - loss: 0.8185 - regression_loss: 0.7406 - classification_loss: 0.0779 163/500 [========>.....................] - ETA: 1:54 - loss: 0.8179 - regression_loss: 0.7400 - classification_loss: 0.0778 164/500 [========>.....................] - ETA: 1:54 - loss: 0.8176 - regression_loss: 0.7399 - classification_loss: 0.0777 165/500 [========>.....................] - ETA: 1:53 - loss: 0.8187 - regression_loss: 0.7411 - classification_loss: 0.0776 166/500 [========>.....................] - ETA: 1:53 - loss: 0.8205 - regression_loss: 0.7427 - classification_loss: 0.0779 167/500 [=========>....................] - ETA: 1:53 - loss: 0.8202 - regression_loss: 0.7425 - classification_loss: 0.0777 168/500 [=========>....................] - ETA: 1:52 - loss: 0.8210 - regression_loss: 0.7434 - classification_loss: 0.0776 169/500 [=========>....................] 
- ETA: 1:52 - loss: 0.8245 - regression_loss: 0.7456 - classification_loss: 0.0788 170/500 [=========>....................] - ETA: 1:52 - loss: 0.8226 - regression_loss: 0.7438 - classification_loss: 0.0787 171/500 [=========>....................] - ETA: 1:51 - loss: 0.8179 - regression_loss: 0.7395 - classification_loss: 0.0785 172/500 [=========>....................] - ETA: 1:51 - loss: 0.8194 - regression_loss: 0.7407 - classification_loss: 0.0787 173/500 [=========>....................] - ETA: 1:50 - loss: 0.8184 - regression_loss: 0.7401 - classification_loss: 0.0783 174/500 [=========>....................] - ETA: 1:50 - loss: 0.8162 - regression_loss: 0.7381 - classification_loss: 0.0780 175/500 [=========>....................] - ETA: 1:50 - loss: 0.8133 - regression_loss: 0.7356 - classification_loss: 0.0777 176/500 [=========>....................] - ETA: 1:49 - loss: 0.8123 - regression_loss: 0.7348 - classification_loss: 0.0775 177/500 [=========>....................] - ETA: 1:49 - loss: 0.8117 - regression_loss: 0.7345 - classification_loss: 0.0773 178/500 [=========>....................] - ETA: 1:49 - loss: 0.8093 - regression_loss: 0.7323 - classification_loss: 0.0770 179/500 [=========>....................] - ETA: 1:48 - loss: 0.8090 - regression_loss: 0.7321 - classification_loss: 0.0769 180/500 [=========>....................] - ETA: 1:48 - loss: 0.8117 - regression_loss: 0.7349 - classification_loss: 0.0767 181/500 [=========>....................] - ETA: 1:48 - loss: 0.8112 - regression_loss: 0.7345 - classification_loss: 0.0767 182/500 [=========>....................] - ETA: 1:47 - loss: 0.8093 - regression_loss: 0.7328 - classification_loss: 0.0765 183/500 [=========>....................] - ETA: 1:47 - loss: 0.8050 - regression_loss: 0.7288 - classification_loss: 0.0762 184/500 [==========>...................] - ETA: 1:47 - loss: 0.8033 - regression_loss: 0.7274 - classification_loss: 0.0759 185/500 [==========>...................] 
- ETA: 1:46 - loss: 0.8034 - regression_loss: 0.7273 - classification_loss: 0.0760 186/500 [==========>...................] - ETA: 1:46 - loss: 0.8050 - regression_loss: 0.7289 - classification_loss: 0.0762 187/500 [==========>...................] - ETA: 1:46 - loss: 0.8063 - regression_loss: 0.7303 - classification_loss: 0.0759 188/500 [==========>...................] - ETA: 1:45 - loss: 0.8076 - regression_loss: 0.7312 - classification_loss: 0.0763 189/500 [==========>...................] - ETA: 1:45 - loss: 0.8056 - regression_loss: 0.7296 - classification_loss: 0.0760 190/500 [==========>...................] - ETA: 1:45 - loss: 0.8073 - regression_loss: 0.7310 - classification_loss: 0.0763 191/500 [==========>...................] - ETA: 1:44 - loss: 0.8104 - regression_loss: 0.7337 - classification_loss: 0.0767 192/500 [==========>...................] - ETA: 1:44 - loss: 0.8078 - regression_loss: 0.7314 - classification_loss: 0.0764 193/500 [==========>...................] - ETA: 1:44 - loss: 0.8079 - regression_loss: 0.7312 - classification_loss: 0.0767 194/500 [==========>...................] - ETA: 1:43 - loss: 0.8110 - regression_loss: 0.7341 - classification_loss: 0.0768 195/500 [==========>...................] - ETA: 1:43 - loss: 0.8133 - regression_loss: 0.7358 - classification_loss: 0.0775 196/500 [==========>...................] - ETA: 1:43 - loss: 0.8146 - regression_loss: 0.7371 - classification_loss: 0.0775 197/500 [==========>...................] - ETA: 1:42 - loss: 0.8161 - regression_loss: 0.7382 - classification_loss: 0.0779 198/500 [==========>...................] - ETA: 1:42 - loss: 0.8159 - regression_loss: 0.7383 - classification_loss: 0.0777 199/500 [==========>...................] - ETA: 1:42 - loss: 0.8136 - regression_loss: 0.7363 - classification_loss: 0.0773 200/500 [===========>..................] - ETA: 1:41 - loss: 0.8183 - regression_loss: 0.7403 - classification_loss: 0.0780 201/500 [===========>..................] 
- ETA: 1:41 - loss: 0.8158 - regression_loss: 0.7382 - classification_loss: 0.0776 202/500 [===========>..................] - ETA: 1:41 - loss: 0.8167 - regression_loss: 0.7393 - classification_loss: 0.0774 203/500 [===========>..................] - ETA: 1:40 - loss: 0.8174 - regression_loss: 0.7397 - classification_loss: 0.0777 204/500 [===========>..................] - ETA: 1:40 - loss: 0.8175 - regression_loss: 0.7399 - classification_loss: 0.0777 205/500 [===========>..................] - ETA: 1:40 - loss: 0.8207 - regression_loss: 0.7422 - classification_loss: 0.0785 206/500 [===========>..................] - ETA: 1:39 - loss: 0.8194 - regression_loss: 0.7411 - classification_loss: 0.0783 207/500 [===========>..................] - ETA: 1:39 - loss: 0.8187 - regression_loss: 0.7407 - classification_loss: 0.0781 208/500 [===========>..................] - ETA: 1:39 - loss: 0.8173 - regression_loss: 0.7394 - classification_loss: 0.0778 209/500 [===========>..................] - ETA: 1:38 - loss: 0.8212 - regression_loss: 0.7406 - classification_loss: 0.0807 210/500 [===========>..................] - ETA: 1:38 - loss: 0.8212 - regression_loss: 0.7407 - classification_loss: 0.0805 211/500 [===========>..................] - ETA: 1:38 - loss: 0.8210 - regression_loss: 0.7404 - classification_loss: 0.0806 212/500 [===========>..................] - ETA: 1:37 - loss: 0.8224 - regression_loss: 0.7416 - classification_loss: 0.0809 213/500 [===========>..................] - ETA: 1:37 - loss: 0.8219 - regression_loss: 0.7412 - classification_loss: 0.0808 214/500 [===========>..................] - ETA: 1:37 - loss: 0.8229 - regression_loss: 0.7422 - classification_loss: 0.0807 215/500 [===========>..................] - ETA: 1:36 - loss: 0.8239 - regression_loss: 0.7432 - classification_loss: 0.0807 216/500 [===========>..................] - ETA: 1:36 - loss: 0.8223 - regression_loss: 0.7418 - classification_loss: 0.0805 217/500 [============>.................] 
- ETA: 1:36 - loss: 0.8218 - regression_loss: 0.7415 - classification_loss: 0.0803 218/500 [============>.................] - ETA: 1:35 - loss: 0.8220 - regression_loss: 0.7418 - classification_loss: 0.0802 219/500 [============>.................] - ETA: 1:35 - loss: 0.8221 - regression_loss: 0.7420 - classification_loss: 0.0801 220/500 [============>.................] - ETA: 1:35 - loss: 0.8219 - regression_loss: 0.7421 - classification_loss: 0.0799 221/500 [============>.................] - ETA: 1:34 - loss: 0.8205 - regression_loss: 0.7409 - classification_loss: 0.0796 222/500 [============>.................] - ETA: 1:34 - loss: 0.8193 - regression_loss: 0.7397 - classification_loss: 0.0796 223/500 [============>.................] - ETA: 1:34 - loss: 0.8221 - regression_loss: 0.7419 - classification_loss: 0.0802 224/500 [============>.................] - ETA: 1:33 - loss: 0.8193 - regression_loss: 0.7394 - classification_loss: 0.0799 225/500 [============>.................] - ETA: 1:33 - loss: 0.8210 - regression_loss: 0.7409 - classification_loss: 0.0802 226/500 [============>.................] - ETA: 1:33 - loss: 0.8200 - regression_loss: 0.7399 - classification_loss: 0.0801 227/500 [============>.................] - ETA: 1:32 - loss: 0.8202 - regression_loss: 0.7402 - classification_loss: 0.0800 228/500 [============>.................] - ETA: 1:32 - loss: 0.8199 - regression_loss: 0.7401 - classification_loss: 0.0798 229/500 [============>.................] - ETA: 1:32 - loss: 0.8211 - regression_loss: 0.7408 - classification_loss: 0.0803 230/500 [============>.................] - ETA: 1:31 - loss: 0.8198 - regression_loss: 0.7397 - classification_loss: 0.0801 231/500 [============>.................] - ETA: 1:31 - loss: 0.8183 - regression_loss: 0.7384 - classification_loss: 0.0799 232/500 [============>.................] - ETA: 1:31 - loss: 0.8181 - regression_loss: 0.7384 - classification_loss: 0.0797 233/500 [============>.................] 
- ETA: 1:30 - loss: 0.8195 - regression_loss: 0.7396 - classification_loss: 0.0799 234/500 [=============>................] - ETA: 1:30 - loss: 0.8199 - regression_loss: 0.7402 - classification_loss: 0.0797 235/500 [=============>................] - ETA: 1:30 - loss: 0.8207 - regression_loss: 0.7411 - classification_loss: 0.0796 236/500 [=============>................] - ETA: 1:29 - loss: 0.8201 - regression_loss: 0.7406 - classification_loss: 0.0795 237/500 [=============>................] - ETA: 1:29 - loss: 0.8219 - regression_loss: 0.7421 - classification_loss: 0.0798 238/500 [=============>................] - ETA: 1:29 - loss: 0.8222 - regression_loss: 0.7424 - classification_loss: 0.0798 239/500 [=============>................] - ETA: 1:28 - loss: 0.8221 - regression_loss: 0.7421 - classification_loss: 0.0799 240/500 [=============>................] - ETA: 1:28 - loss: 0.8220 - regression_loss: 0.7420 - classification_loss: 0.0800 241/500 [=============>................] - ETA: 1:27 - loss: 0.8227 - regression_loss: 0.7427 - classification_loss: 0.0801 242/500 [=============>................] - ETA: 1:27 - loss: 0.8259 - regression_loss: 0.7456 - classification_loss: 0.0804 243/500 [=============>................] - ETA: 1:27 - loss: 0.8284 - regression_loss: 0.7480 - classification_loss: 0.0803 244/500 [=============>................] - ETA: 1:26 - loss: 0.8311 - regression_loss: 0.7500 - classification_loss: 0.0811 245/500 [=============>................] - ETA: 1:26 - loss: 0.8325 - regression_loss: 0.7513 - classification_loss: 0.0812 246/500 [=============>................] - ETA: 1:26 - loss: 0.8344 - regression_loss: 0.7531 - classification_loss: 0.0814 247/500 [=============>................] - ETA: 1:25 - loss: 0.8335 - regression_loss: 0.7523 - classification_loss: 0.0812 248/500 [=============>................] - ETA: 1:25 - loss: 0.8344 - regression_loss: 0.7530 - classification_loss: 0.0814 249/500 [=============>................] 
- ETA: 1:25 - loss: 0.8350 - regression_loss: 0.7534 - classification_loss: 0.0815 250/500 [==============>...............] - ETA: 1:24 - loss: 0.8334 - regression_loss: 0.7521 - classification_loss: 0.0813 251/500 [==============>...............] - ETA: 1:24 - loss: 0.8346 - regression_loss: 0.7529 - classification_loss: 0.0817 252/500 [==============>...............] - ETA: 1:24 - loss: 0.8334 - regression_loss: 0.7518 - classification_loss: 0.0816 253/500 [==============>...............] - ETA: 1:23 - loss: 0.8322 - regression_loss: 0.7508 - classification_loss: 0.0814 254/500 [==============>...............] - ETA: 1:23 - loss: 0.8339 - regression_loss: 0.7521 - classification_loss: 0.0818 255/500 [==============>...............] - ETA: 1:23 - loss: 0.8365 - regression_loss: 0.7547 - classification_loss: 0.0819 256/500 [==============>...............] - ETA: 1:22 - loss: 0.8345 - regression_loss: 0.7529 - classification_loss: 0.0816 257/500 [==============>...............] - ETA: 1:22 - loss: 0.8349 - regression_loss: 0.7534 - classification_loss: 0.0815 258/500 [==============>...............] - ETA: 1:22 - loss: 0.8359 - regression_loss: 0.7542 - classification_loss: 0.0817 259/500 [==============>...............] - ETA: 1:21 - loss: 0.8376 - regression_loss: 0.7559 - classification_loss: 0.0817 260/500 [==============>...............] - ETA: 1:21 - loss: 0.8420 - regression_loss: 0.7594 - classification_loss: 0.0826 261/500 [==============>...............] - ETA: 1:21 - loss: 0.8415 - regression_loss: 0.7590 - classification_loss: 0.0825 262/500 [==============>...............] - ETA: 1:20 - loss: 0.8396 - regression_loss: 0.7573 - classification_loss: 0.0823 263/500 [==============>...............] - ETA: 1:20 - loss: 0.8391 - regression_loss: 0.7570 - classification_loss: 0.0821 264/500 [==============>...............] - ETA: 1:20 - loss: 0.8390 - regression_loss: 0.7570 - classification_loss: 0.0819 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.8410 - regression_loss: 0.7583 - classification_loss: 0.0827 266/500 [==============>...............] - ETA: 1:19 - loss: 0.8413 - regression_loss: 0.7583 - classification_loss: 0.0829 267/500 [===============>..............] - ETA: 1:19 - loss: 0.8412 - regression_loss: 0.7583 - classification_loss: 0.0829 268/500 [===============>..............] - ETA: 1:18 - loss: 0.8436 - regression_loss: 0.7605 - classification_loss: 0.0831 269/500 [===============>..............] - ETA: 1:18 - loss: 0.8440 - regression_loss: 0.7609 - classification_loss: 0.0831 270/500 [===============>..............] - ETA: 1:18 - loss: 0.8443 - regression_loss: 0.7612 - classification_loss: 0.0831 271/500 [===============>..............] - ETA: 1:17 - loss: 0.8467 - regression_loss: 0.7635 - classification_loss: 0.0832 272/500 [===============>..............] - ETA: 1:17 - loss: 0.8465 - regression_loss: 0.7633 - classification_loss: 0.0832 273/500 [===============>..............] - ETA: 1:17 - loss: 0.8454 - regression_loss: 0.7622 - classification_loss: 0.0831 274/500 [===============>..............] - ETA: 1:16 - loss: 0.8446 - regression_loss: 0.7617 - classification_loss: 0.0829 275/500 [===============>..............] - ETA: 1:16 - loss: 0.8440 - regression_loss: 0.7611 - classification_loss: 0.0829 276/500 [===============>..............] - ETA: 1:16 - loss: 0.8427 - regression_loss: 0.7600 - classification_loss: 0.0827 277/500 [===============>..............] - ETA: 1:15 - loss: 0.8447 - regression_loss: 0.7619 - classification_loss: 0.0828 278/500 [===============>..............] - ETA: 1:15 - loss: 0.8429 - regression_loss: 0.7603 - classification_loss: 0.0826 279/500 [===============>..............] - ETA: 1:14 - loss: 0.8426 - regression_loss: 0.7600 - classification_loss: 0.0826 280/500 [===============>..............] - ETA: 1:14 - loss: 0.8401 - regression_loss: 0.7577 - classification_loss: 0.0824 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.8404 - regression_loss: 0.7581 - classification_loss: 0.0823 282/500 [===============>..............] - ETA: 1:13 - loss: 0.8402 - regression_loss: 0.7579 - classification_loss: 0.0823 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8414 - regression_loss: 0.7590 - classification_loss: 0.0824 284/500 [================>.............] - ETA: 1:13 - loss: 0.8408 - regression_loss: 0.7586 - classification_loss: 0.0822 285/500 [================>.............] - ETA: 1:12 - loss: 0.8401 - regression_loss: 0.7580 - classification_loss: 0.0821 286/500 [================>.............] - ETA: 1:12 - loss: 0.8390 - regression_loss: 0.7570 - classification_loss: 0.0820 287/500 [================>.............] - ETA: 1:12 - loss: 0.8403 - regression_loss: 0.7582 - classification_loss: 0.0821 288/500 [================>.............] - ETA: 1:11 - loss: 0.8408 - regression_loss: 0.7586 - classification_loss: 0.0822 289/500 [================>.............] - ETA: 1:11 - loss: 0.8404 - regression_loss: 0.7583 - classification_loss: 0.0821 290/500 [================>.............] - ETA: 1:11 - loss: 0.8409 - regression_loss: 0.7588 - classification_loss: 0.0821 291/500 [================>.............] - ETA: 1:10 - loss: 0.8409 - regression_loss: 0.7588 - classification_loss: 0.0821 292/500 [================>.............] - ETA: 1:10 - loss: 0.8402 - regression_loss: 0.7582 - classification_loss: 0.0820 293/500 [================>.............] - ETA: 1:10 - loss: 0.8421 - regression_loss: 0.7595 - classification_loss: 0.0826 294/500 [================>.............] - ETA: 1:09 - loss: 0.8407 - regression_loss: 0.7583 - classification_loss: 0.0824 295/500 [================>.............] - ETA: 1:09 - loss: 0.8428 - regression_loss: 0.7597 - classification_loss: 0.0830 296/500 [================>.............] - ETA: 1:09 - loss: 0.8419 - regression_loss: 0.7589 - classification_loss: 0.0830 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.8414 - regression_loss: 0.7585 - classification_loss: 0.0829 298/500 [================>.............] - ETA: 1:08 - loss: 0.8402 - regression_loss: 0.7575 - classification_loss: 0.0827 299/500 [================>.............] - ETA: 1:08 - loss: 0.8419 - regression_loss: 0.7588 - classification_loss: 0.0831 300/500 [=================>............] - ETA: 1:07 - loss: 0.8423 - regression_loss: 0.7592 - classification_loss: 0.0831 301/500 [=================>............] - ETA: 1:07 - loss: 0.8435 - regression_loss: 0.7598 - classification_loss: 0.0836 302/500 [=================>............] - ETA: 1:07 - loss: 0.8441 - regression_loss: 0.7607 - classification_loss: 0.0834 303/500 [=================>............] - ETA: 1:06 - loss: 0.8439 - regression_loss: 0.7605 - classification_loss: 0.0834 304/500 [=================>............] - ETA: 1:06 - loss: 0.8457 - regression_loss: 0.7621 - classification_loss: 0.0835 305/500 [=================>............] - ETA: 1:06 - loss: 0.8448 - regression_loss: 0.7614 - classification_loss: 0.0834 306/500 [=================>............] - ETA: 1:05 - loss: 0.8450 - regression_loss: 0.7615 - classification_loss: 0.0836 307/500 [=================>............] - ETA: 1:05 - loss: 0.8436 - regression_loss: 0.7602 - classification_loss: 0.0833 308/500 [=================>............] - ETA: 1:05 - loss: 0.8447 - regression_loss: 0.7612 - classification_loss: 0.0835 309/500 [=================>............] - ETA: 1:04 - loss: 0.8452 - regression_loss: 0.7616 - classification_loss: 0.0836 310/500 [=================>............] - ETA: 1:04 - loss: 0.8456 - regression_loss: 0.7621 - classification_loss: 0.0835 311/500 [=================>............] - ETA: 1:04 - loss: 0.8455 - regression_loss: 0.7621 - classification_loss: 0.0834 312/500 [=================>............] - ETA: 1:03 - loss: 0.8442 - regression_loss: 0.7609 - classification_loss: 0.0832 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.8441 - regression_loss: 0.7609 - classification_loss: 0.0832 314/500 [=================>............] - ETA: 1:03 - loss: 0.8420 - regression_loss: 0.7590 - classification_loss: 0.0830 315/500 [=================>............] - ETA: 1:02 - loss: 0.8443 - regression_loss: 0.7606 - classification_loss: 0.0837 316/500 [=================>............] - ETA: 1:02 - loss: 0.8456 - regression_loss: 0.7617 - classification_loss: 0.0840 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8447 - regression_loss: 0.7608 - classification_loss: 0.0839 318/500 [==================>...........] - ETA: 1:01 - loss: 0.8449 - regression_loss: 0.7610 - classification_loss: 0.0839 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8451 - regression_loss: 0.7609 - classification_loss: 0.0842 320/500 [==================>...........] - ETA: 1:01 - loss: 0.8457 - regression_loss: 0.7615 - classification_loss: 0.0843 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8446 - regression_loss: 0.7605 - classification_loss: 0.0841 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8444 - regression_loss: 0.7604 - classification_loss: 0.0840 323/500 [==================>...........] - ETA: 1:00 - loss: 0.8438 - regression_loss: 0.7599 - classification_loss: 0.0839 324/500 [==================>...........] - ETA: 59s - loss: 0.8432 - regression_loss: 0.7593 - classification_loss: 0.0839  325/500 [==================>...........] - ETA: 59s - loss: 0.8433 - regression_loss: 0.7596 - classification_loss: 0.0837 326/500 [==================>...........] - ETA: 59s - loss: 0.8446 - regression_loss: 0.7609 - classification_loss: 0.0837 327/500 [==================>...........] - ETA: 58s - loss: 0.8442 - regression_loss: 0.7607 - classification_loss: 0.0835 328/500 [==================>...........] - ETA: 58s - loss: 0.8448 - regression_loss: 0.7612 - classification_loss: 0.0836 329/500 [==================>...........] 
- ETA: 58s - loss: 0.8460 - regression_loss: 0.7624 - classification_loss: 0.0835 330/500 [==================>...........] - ETA: 57s - loss: 0.8455 - regression_loss: 0.7621 - classification_loss: 0.0834 331/500 [==================>...........] - ETA: 57s - loss: 0.8451 - regression_loss: 0.7619 - classification_loss: 0.0832 332/500 [==================>...........] - ETA: 57s - loss: 0.8449 - regression_loss: 0.7617 - classification_loss: 0.0832 333/500 [==================>...........] - ETA: 56s - loss: 0.8478 - regression_loss: 0.7642 - classification_loss: 0.0837 334/500 [===================>..........] - ETA: 56s - loss: 0.8464 - regression_loss: 0.7629 - classification_loss: 0.0836 335/500 [===================>..........] - ETA: 56s - loss: 0.8473 - regression_loss: 0.7636 - classification_loss: 0.0837 336/500 [===================>..........] - ETA: 55s - loss: 0.8462 - regression_loss: 0.7627 - classification_loss: 0.0835 337/500 [===================>..........] - ETA: 55s - loss: 0.8454 - regression_loss: 0.7619 - classification_loss: 0.0834 338/500 [===================>..........] - ETA: 55s - loss: 0.8466 - regression_loss: 0.7630 - classification_loss: 0.0836 339/500 [===================>..........] - ETA: 54s - loss: 0.8462 - regression_loss: 0.7628 - classification_loss: 0.0835 340/500 [===================>..........] - ETA: 54s - loss: 0.8452 - regression_loss: 0.7618 - classification_loss: 0.0834 341/500 [===================>..........] - ETA: 54s - loss: 0.8460 - regression_loss: 0.7625 - classification_loss: 0.0835 342/500 [===================>..........] - ETA: 53s - loss: 0.8461 - regression_loss: 0.7626 - classification_loss: 0.0835 343/500 [===================>..........] - ETA: 53s - loss: 0.8443 - regression_loss: 0.7610 - classification_loss: 0.0834 344/500 [===================>..........] - ETA: 52s - loss: 0.8451 - regression_loss: 0.7617 - classification_loss: 0.0834 345/500 [===================>..........] 
(per-step progress-bar redraws for epoch 36, steps 346-499, omitted)
500/500 [==============================] - 170s 339ms/step - loss: 0.8386 - regression_loss: 0.7571 - classification_loss: 0.0815
326 instances of class plum with average precision: 0.8436
mAP: 0.8436
Epoch 00036: saving model to ./training/snapshots/resnet101_pascal_36.h5
Epoch 37/150
1/500 [..............................] - ETA: 2:34 - loss: 0.6734 - regression_loss: 0.6014 - classification_loss: 0.0720
2/500 [..............................] - ETA: 2:43 - loss: 0.4708 - regression_loss: 0.4315 - classification_loss: 0.0393
3/500 [..............................] - ETA: 2:45 - loss: 0.6454 - regression_loss: 0.5922 - classification_loss: 0.0532
4/500 [..............................]
- ETA: 2:45 - loss: 0.7258 - regression_loss: 0.6685 - classification_loss: 0.0574
(per-step progress-bar redraws for epoch 37, steps 5-179, omitted)
180/500 [=========>....................]
- ETA: 1:48 - loss: 0.8872 - regression_loss: 0.7872 - classification_loss: 0.0999 181/500 [=========>....................] - ETA: 1:48 - loss: 0.8852 - regression_loss: 0.7858 - classification_loss: 0.0995 182/500 [=========>....................] - ETA: 1:48 - loss: 0.8886 - regression_loss: 0.7885 - classification_loss: 0.1000 183/500 [=========>....................] - ETA: 1:47 - loss: 0.8880 - regression_loss: 0.7883 - classification_loss: 0.0997 184/500 [==========>...................] - ETA: 1:47 - loss: 0.8882 - regression_loss: 0.7886 - classification_loss: 0.0997 185/500 [==========>...................] - ETA: 1:47 - loss: 0.8895 - regression_loss: 0.7898 - classification_loss: 0.0997 186/500 [==========>...................] - ETA: 1:47 - loss: 0.8931 - regression_loss: 0.7926 - classification_loss: 0.1005 187/500 [==========>...................] - ETA: 1:46 - loss: 0.8958 - regression_loss: 0.7952 - classification_loss: 0.1006 188/500 [==========>...................] - ETA: 1:46 - loss: 0.8990 - regression_loss: 0.7976 - classification_loss: 0.1014 189/500 [==========>...................] - ETA: 1:46 - loss: 0.8979 - regression_loss: 0.7968 - classification_loss: 0.1011 190/500 [==========>...................] - ETA: 1:45 - loss: 0.8950 - regression_loss: 0.7943 - classification_loss: 0.1007 191/500 [==========>...................] - ETA: 1:45 - loss: 0.8928 - regression_loss: 0.7926 - classification_loss: 0.1002 192/500 [==========>...................] - ETA: 1:44 - loss: 0.8920 - regression_loss: 0.7920 - classification_loss: 0.1000 193/500 [==========>...................] - ETA: 1:44 - loss: 0.8923 - regression_loss: 0.7926 - classification_loss: 0.0998 194/500 [==========>...................] - ETA: 1:44 - loss: 0.8937 - regression_loss: 0.7934 - classification_loss: 0.1003 195/500 [==========>...................] - ETA: 1:43 - loss: 0.8932 - regression_loss: 0.7932 - classification_loss: 0.1000 196/500 [==========>...................] 
- ETA: 1:43 - loss: 0.8938 - regression_loss: 0.7937 - classification_loss: 0.1001 197/500 [==========>...................] - ETA: 1:43 - loss: 0.8932 - regression_loss: 0.7934 - classification_loss: 0.0998 198/500 [==========>...................] - ETA: 1:42 - loss: 0.8913 - regression_loss: 0.7917 - classification_loss: 0.0996 199/500 [==========>...................] - ETA: 1:42 - loss: 0.8922 - regression_loss: 0.7927 - classification_loss: 0.0995 200/500 [===========>..................] - ETA: 1:42 - loss: 0.8914 - regression_loss: 0.7921 - classification_loss: 0.0994 201/500 [===========>..................] - ETA: 1:41 - loss: 0.8904 - regression_loss: 0.7911 - classification_loss: 0.0993 202/500 [===========>..................] - ETA: 1:41 - loss: 0.8918 - regression_loss: 0.7925 - classification_loss: 0.0993 203/500 [===========>..................] - ETA: 1:41 - loss: 0.8953 - regression_loss: 0.7953 - classification_loss: 0.1000 204/500 [===========>..................] - ETA: 1:40 - loss: 0.8980 - regression_loss: 0.7975 - classification_loss: 0.1005 205/500 [===========>..................] - ETA: 1:40 - loss: 0.8957 - regression_loss: 0.7956 - classification_loss: 0.1001 206/500 [===========>..................] - ETA: 1:40 - loss: 0.8992 - regression_loss: 0.7986 - classification_loss: 0.1006 207/500 [===========>..................] - ETA: 1:39 - loss: 0.8969 - regression_loss: 0.7966 - classification_loss: 0.1002 208/500 [===========>..................] - ETA: 1:39 - loss: 0.8963 - regression_loss: 0.7963 - classification_loss: 0.0999 209/500 [===========>..................] - ETA: 1:39 - loss: 0.8949 - regression_loss: 0.7953 - classification_loss: 0.0996 210/500 [===========>..................] - ETA: 1:38 - loss: 0.8944 - regression_loss: 0.7950 - classification_loss: 0.0995 211/500 [===========>..................] - ETA: 1:38 - loss: 0.8946 - regression_loss: 0.7952 - classification_loss: 0.0994 212/500 [===========>..................] 
- ETA: 1:38 - loss: 0.8959 - regression_loss: 0.7961 - classification_loss: 0.0999 213/500 [===========>..................] - ETA: 1:37 - loss: 0.8967 - regression_loss: 0.7969 - classification_loss: 0.0997 214/500 [===========>..................] - ETA: 1:37 - loss: 0.8961 - regression_loss: 0.7965 - classification_loss: 0.0996 215/500 [===========>..................] - ETA: 1:37 - loss: 0.8970 - regression_loss: 0.7973 - classification_loss: 0.0997 216/500 [===========>..................] - ETA: 1:36 - loss: 0.8964 - regression_loss: 0.7969 - classification_loss: 0.0996 217/500 [============>.................] - ETA: 1:36 - loss: 0.8954 - regression_loss: 0.7958 - classification_loss: 0.0996 218/500 [============>.................] - ETA: 1:36 - loss: 0.8951 - regression_loss: 0.7956 - classification_loss: 0.0996 219/500 [============>.................] - ETA: 1:35 - loss: 0.8923 - regression_loss: 0.7932 - classification_loss: 0.0992 220/500 [============>.................] - ETA: 1:35 - loss: 0.8897 - regression_loss: 0.7907 - classification_loss: 0.0990 221/500 [============>.................] - ETA: 1:35 - loss: 0.8892 - regression_loss: 0.7904 - classification_loss: 0.0987 222/500 [============>.................] - ETA: 1:34 - loss: 0.8925 - regression_loss: 0.7929 - classification_loss: 0.0996 223/500 [============>.................] - ETA: 1:34 - loss: 0.8920 - regression_loss: 0.7926 - classification_loss: 0.0994 224/500 [============>.................] - ETA: 1:34 - loss: 0.8916 - regression_loss: 0.7922 - classification_loss: 0.0995 225/500 [============>.................] - ETA: 1:33 - loss: 0.8899 - regression_loss: 0.7907 - classification_loss: 0.0992 226/500 [============>.................] - ETA: 1:33 - loss: 0.8905 - regression_loss: 0.7913 - classification_loss: 0.0992 227/500 [============>.................] - ETA: 1:33 - loss: 0.8902 - regression_loss: 0.7911 - classification_loss: 0.0991 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.8887 - regression_loss: 0.7899 - classification_loss: 0.0989 229/500 [============>.................] - ETA: 1:32 - loss: 0.8908 - regression_loss: 0.7919 - classification_loss: 0.0989 230/500 [============>.................] - ETA: 1:32 - loss: 0.8908 - regression_loss: 0.7919 - classification_loss: 0.0988 231/500 [============>.................] - ETA: 1:31 - loss: 0.8945 - regression_loss: 0.7955 - classification_loss: 0.0990 232/500 [============>.................] - ETA: 1:31 - loss: 0.8932 - regression_loss: 0.7944 - classification_loss: 0.0989 233/500 [============>.................] - ETA: 1:30 - loss: 0.8938 - regression_loss: 0.7949 - classification_loss: 0.0989 234/500 [=============>................] - ETA: 1:30 - loss: 0.8955 - regression_loss: 0.7965 - classification_loss: 0.0990 235/500 [=============>................] - ETA: 1:30 - loss: 0.8932 - regression_loss: 0.7944 - classification_loss: 0.0989 236/500 [=============>................] - ETA: 1:29 - loss: 0.8916 - regression_loss: 0.7931 - classification_loss: 0.0986 237/500 [=============>................] - ETA: 1:29 - loss: 0.8904 - regression_loss: 0.7920 - classification_loss: 0.0984 238/500 [=============>................] - ETA: 1:29 - loss: 0.8913 - regression_loss: 0.7929 - classification_loss: 0.0984 239/500 [=============>................] - ETA: 1:28 - loss: 0.8900 - regression_loss: 0.7919 - classification_loss: 0.0982 240/500 [=============>................] - ETA: 1:28 - loss: 0.8890 - regression_loss: 0.7909 - classification_loss: 0.0980 241/500 [=============>................] - ETA: 1:28 - loss: 0.8887 - regression_loss: 0.7907 - classification_loss: 0.0980 242/500 [=============>................] - ETA: 1:27 - loss: 0.8911 - regression_loss: 0.7924 - classification_loss: 0.0988 243/500 [=============>................] - ETA: 1:27 - loss: 0.8934 - regression_loss: 0.7940 - classification_loss: 0.0994 244/500 [=============>................] 
- ETA: 1:27 - loss: 0.8931 - regression_loss: 0.7939 - classification_loss: 0.0992 245/500 [=============>................] - ETA: 1:26 - loss: 0.8923 - regression_loss: 0.7932 - classification_loss: 0.0991 246/500 [=============>................] - ETA: 1:26 - loss: 0.8930 - regression_loss: 0.7939 - classification_loss: 0.0991 247/500 [=============>................] - ETA: 1:26 - loss: 0.8918 - regression_loss: 0.7929 - classification_loss: 0.0989 248/500 [=============>................] - ETA: 1:25 - loss: 0.8910 - regression_loss: 0.7922 - classification_loss: 0.0988 249/500 [=============>................] - ETA: 1:25 - loss: 0.8916 - regression_loss: 0.7927 - classification_loss: 0.0989 250/500 [==============>...............] - ETA: 1:25 - loss: 0.8901 - regression_loss: 0.7914 - classification_loss: 0.0987 251/500 [==============>...............] - ETA: 1:24 - loss: 0.8890 - regression_loss: 0.7907 - classification_loss: 0.0984 252/500 [==============>...............] - ETA: 1:24 - loss: 0.8878 - regression_loss: 0.7897 - classification_loss: 0.0981 253/500 [==============>...............] - ETA: 1:24 - loss: 0.8892 - regression_loss: 0.7911 - classification_loss: 0.0981 254/500 [==============>...............] - ETA: 1:23 - loss: 0.8891 - regression_loss: 0.7912 - classification_loss: 0.0979 255/500 [==============>...............] - ETA: 1:23 - loss: 0.8876 - regression_loss: 0.7898 - classification_loss: 0.0978 256/500 [==============>...............] - ETA: 1:23 - loss: 0.8856 - regression_loss: 0.7881 - classification_loss: 0.0975 257/500 [==============>...............] - ETA: 1:22 - loss: 0.8860 - regression_loss: 0.7883 - classification_loss: 0.0977 258/500 [==============>...............] - ETA: 1:22 - loss: 0.8858 - regression_loss: 0.7882 - classification_loss: 0.0976 259/500 [==============>...............] - ETA: 1:22 - loss: 0.8850 - regression_loss: 0.7876 - classification_loss: 0.0974 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.8857 - regression_loss: 0.7885 - classification_loss: 0.0973 261/500 [==============>...............] - ETA: 1:21 - loss: 0.8843 - regression_loss: 0.7871 - classification_loss: 0.0971 262/500 [==============>...............] - ETA: 1:20 - loss: 0.8830 - regression_loss: 0.7861 - classification_loss: 0.0969 263/500 [==============>...............] - ETA: 1:20 - loss: 0.8821 - regression_loss: 0.7854 - classification_loss: 0.0967 264/500 [==============>...............] - ETA: 1:20 - loss: 0.8824 - regression_loss: 0.7857 - classification_loss: 0.0967 265/500 [==============>...............] - ETA: 1:19 - loss: 0.8820 - regression_loss: 0.7855 - classification_loss: 0.0966 266/500 [==============>...............] - ETA: 1:19 - loss: 0.8827 - regression_loss: 0.7861 - classification_loss: 0.0966 267/500 [===============>..............] - ETA: 1:19 - loss: 0.8814 - regression_loss: 0.7850 - classification_loss: 0.0963 268/500 [===============>..............] - ETA: 1:18 - loss: 0.8821 - regression_loss: 0.7858 - classification_loss: 0.0964 269/500 [===============>..............] - ETA: 1:18 - loss: 0.8805 - regression_loss: 0.7844 - classification_loss: 0.0961 270/500 [===============>..............] - ETA: 1:18 - loss: 0.8791 - regression_loss: 0.7833 - classification_loss: 0.0958 271/500 [===============>..............] - ETA: 1:17 - loss: 0.8784 - regression_loss: 0.7827 - classification_loss: 0.0957 272/500 [===============>..............] - ETA: 1:17 - loss: 0.8767 - regression_loss: 0.7812 - classification_loss: 0.0955 273/500 [===============>..............] - ETA: 1:17 - loss: 0.8772 - regression_loss: 0.7815 - classification_loss: 0.0957 274/500 [===============>..............] - ETA: 1:16 - loss: 0.8780 - regression_loss: 0.7824 - classification_loss: 0.0955 275/500 [===============>..............] - ETA: 1:16 - loss: 0.8777 - regression_loss: 0.7823 - classification_loss: 0.0954 276/500 [===============>..............] 
- ETA: 1:16 - loss: 0.8768 - regression_loss: 0.7814 - classification_loss: 0.0953 277/500 [===============>..............] - ETA: 1:15 - loss: 0.8758 - regression_loss: 0.7807 - classification_loss: 0.0951 278/500 [===============>..............] - ETA: 1:15 - loss: 0.8749 - regression_loss: 0.7800 - classification_loss: 0.0949 279/500 [===============>..............] - ETA: 1:15 - loss: 0.8740 - regression_loss: 0.7792 - classification_loss: 0.0948 280/500 [===============>..............] - ETA: 1:14 - loss: 0.8737 - regression_loss: 0.7791 - classification_loss: 0.0946 281/500 [===============>..............] - ETA: 1:14 - loss: 0.8752 - regression_loss: 0.7803 - classification_loss: 0.0948 282/500 [===============>..............] - ETA: 1:14 - loss: 0.8746 - regression_loss: 0.7798 - classification_loss: 0.0948 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8748 - regression_loss: 0.7798 - classification_loss: 0.0950 284/500 [================>.............] - ETA: 1:13 - loss: 0.8738 - regression_loss: 0.7791 - classification_loss: 0.0947 285/500 [================>.............] - ETA: 1:13 - loss: 0.8743 - regression_loss: 0.7796 - classification_loss: 0.0948 286/500 [================>.............] - ETA: 1:12 - loss: 0.8753 - regression_loss: 0.7805 - classification_loss: 0.0948 287/500 [================>.............] - ETA: 1:12 - loss: 0.8747 - regression_loss: 0.7802 - classification_loss: 0.0946 288/500 [================>.............] - ETA: 1:12 - loss: 0.8757 - regression_loss: 0.7812 - classification_loss: 0.0945 289/500 [================>.............] - ETA: 1:11 - loss: 0.8749 - regression_loss: 0.7804 - classification_loss: 0.0944 290/500 [================>.............] - ETA: 1:11 - loss: 0.8754 - regression_loss: 0.7809 - classification_loss: 0.0945 291/500 [================>.............] - ETA: 1:11 - loss: 0.8757 - regression_loss: 0.7812 - classification_loss: 0.0944 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.8748 - regression_loss: 0.7804 - classification_loss: 0.0944 293/500 [================>.............] - ETA: 1:10 - loss: 0.8748 - regression_loss: 0.7805 - classification_loss: 0.0943 294/500 [================>.............] - ETA: 1:10 - loss: 0.8737 - regression_loss: 0.7792 - classification_loss: 0.0944 295/500 [================>.............] - ETA: 1:09 - loss: 0.8733 - regression_loss: 0.7790 - classification_loss: 0.0943 296/500 [================>.............] - ETA: 1:09 - loss: 0.8725 - regression_loss: 0.7781 - classification_loss: 0.0944 297/500 [================>.............] - ETA: 1:09 - loss: 0.8745 - regression_loss: 0.7797 - classification_loss: 0.0947 298/500 [================>.............] - ETA: 1:08 - loss: 0.8739 - regression_loss: 0.7793 - classification_loss: 0.0945 299/500 [================>.............] - ETA: 1:08 - loss: 0.8742 - regression_loss: 0.7798 - classification_loss: 0.0944 300/500 [=================>............] - ETA: 1:08 - loss: 0.8741 - regression_loss: 0.7797 - classification_loss: 0.0943 301/500 [=================>............] - ETA: 1:07 - loss: 0.8745 - regression_loss: 0.7801 - classification_loss: 0.0944 302/500 [=================>............] - ETA: 1:07 - loss: 0.8733 - regression_loss: 0.7790 - classification_loss: 0.0943 303/500 [=================>............] - ETA: 1:06 - loss: 0.8714 - regression_loss: 0.7773 - classification_loss: 0.0941 304/500 [=================>............] - ETA: 1:06 - loss: 0.8718 - regression_loss: 0.7778 - classification_loss: 0.0939 305/500 [=================>............] - ETA: 1:06 - loss: 0.8706 - regression_loss: 0.7768 - classification_loss: 0.0937 306/500 [=================>............] - ETA: 1:05 - loss: 0.8692 - regression_loss: 0.7757 - classification_loss: 0.0935 307/500 [=================>............] - ETA: 1:05 - loss: 0.8689 - regression_loss: 0.7753 - classification_loss: 0.0936 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.8696 - regression_loss: 0.7760 - classification_loss: 0.0936 309/500 [=================>............] - ETA: 1:04 - loss: 0.8683 - regression_loss: 0.7747 - classification_loss: 0.0936 310/500 [=================>............] - ETA: 1:04 - loss: 0.8677 - regression_loss: 0.7742 - classification_loss: 0.0935 311/500 [=================>............] - ETA: 1:04 - loss: 0.8683 - regression_loss: 0.7749 - classification_loss: 0.0934 312/500 [=================>............] - ETA: 1:03 - loss: 0.8682 - regression_loss: 0.7748 - classification_loss: 0.0934 313/500 [=================>............] - ETA: 1:03 - loss: 0.8698 - regression_loss: 0.7759 - classification_loss: 0.0939 314/500 [=================>............] - ETA: 1:03 - loss: 0.8696 - regression_loss: 0.7758 - classification_loss: 0.0938 315/500 [=================>............] - ETA: 1:02 - loss: 0.8676 - regression_loss: 0.7741 - classification_loss: 0.0935 316/500 [=================>............] - ETA: 1:02 - loss: 0.8672 - regression_loss: 0.7738 - classification_loss: 0.0934 317/500 [==================>...........] - ETA: 1:02 - loss: 0.8677 - regression_loss: 0.7744 - classification_loss: 0.0933 318/500 [==================>...........] - ETA: 1:01 - loss: 0.8681 - regression_loss: 0.7748 - classification_loss: 0.0933 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8672 - regression_loss: 0.7740 - classification_loss: 0.0931 320/500 [==================>...........] - ETA: 1:01 - loss: 0.8687 - regression_loss: 0.7755 - classification_loss: 0.0932 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8673 - regression_loss: 0.7742 - classification_loss: 0.0930 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8686 - regression_loss: 0.7755 - classification_loss: 0.0931 323/500 [==================>...........] - ETA: 1:00 - loss: 0.8683 - regression_loss: 0.7753 - classification_loss: 0.0930 324/500 [==================>...........] 
- ETA: 59s - loss: 0.8686 - regression_loss: 0.7757 - classification_loss: 0.0929  325/500 [==================>...........] - ETA: 59s - loss: 0.8681 - regression_loss: 0.7754 - classification_loss: 0.0928 326/500 [==================>...........] - ETA: 59s - loss: 0.8672 - regression_loss: 0.7745 - classification_loss: 0.0927 327/500 [==================>...........] - ETA: 58s - loss: 0.8654 - regression_loss: 0.7729 - classification_loss: 0.0925 328/500 [==================>...........] - ETA: 58s - loss: 0.8636 - regression_loss: 0.7713 - classification_loss: 0.0922 329/500 [==================>...........] - ETA: 58s - loss: 0.8632 - regression_loss: 0.7711 - classification_loss: 0.0921 330/500 [==================>...........] - ETA: 57s - loss: 0.8622 - regression_loss: 0.7704 - classification_loss: 0.0919 331/500 [==================>...........] - ETA: 57s - loss: 0.8619 - regression_loss: 0.7699 - classification_loss: 0.0920 332/500 [==================>...........] - ETA: 57s - loss: 0.8614 - regression_loss: 0.7695 - classification_loss: 0.0919 333/500 [==================>...........] - ETA: 56s - loss: 0.8608 - regression_loss: 0.7692 - classification_loss: 0.0917 334/500 [===================>..........] - ETA: 56s - loss: 0.8605 - regression_loss: 0.7690 - classification_loss: 0.0916 335/500 [===================>..........] - ETA: 56s - loss: 0.8596 - regression_loss: 0.7681 - classification_loss: 0.0915 336/500 [===================>..........] - ETA: 55s - loss: 0.8593 - regression_loss: 0.7677 - classification_loss: 0.0915 337/500 [===================>..........] - ETA: 55s - loss: 0.8602 - regression_loss: 0.7685 - classification_loss: 0.0917 338/500 [===================>..........] - ETA: 55s - loss: 0.8589 - regression_loss: 0.7674 - classification_loss: 0.0915 339/500 [===================>..........] - ETA: 54s - loss: 0.8599 - regression_loss: 0.7682 - classification_loss: 0.0917 340/500 [===================>..........] 
- ETA: 54s - loss: 0.8594 - regression_loss: 0.7678 - classification_loss: 0.0917 341/500 [===================>..........] - ETA: 54s - loss: 0.8591 - regression_loss: 0.7676 - classification_loss: 0.0915 342/500 [===================>..........] - ETA: 53s - loss: 0.8586 - regression_loss: 0.7674 - classification_loss: 0.0913 343/500 [===================>..........] - ETA: 53s - loss: 0.8598 - regression_loss: 0.7685 - classification_loss: 0.0913 344/500 [===================>..........] - ETA: 53s - loss: 0.8604 - regression_loss: 0.7691 - classification_loss: 0.0913 345/500 [===================>..........] - ETA: 52s - loss: 0.8592 - regression_loss: 0.7681 - classification_loss: 0.0911 346/500 [===================>..........] - ETA: 52s - loss: 0.8592 - regression_loss: 0.7681 - classification_loss: 0.0910 347/500 [===================>..........] - ETA: 51s - loss: 0.8594 - regression_loss: 0.7684 - classification_loss: 0.0910 348/500 [===================>..........] - ETA: 51s - loss: 0.8594 - regression_loss: 0.7681 - classification_loss: 0.0913 349/500 [===================>..........] - ETA: 51s - loss: 0.8596 - regression_loss: 0.7682 - classification_loss: 0.0914 350/500 [====================>.........] - ETA: 50s - loss: 0.8591 - regression_loss: 0.7678 - classification_loss: 0.0913 351/500 [====================>.........] - ETA: 50s - loss: 0.8589 - regression_loss: 0.7678 - classification_loss: 0.0911 352/500 [====================>.........] - ETA: 50s - loss: 0.8589 - regression_loss: 0.7680 - classification_loss: 0.0910 353/500 [====================>.........] - ETA: 49s - loss: 0.8582 - regression_loss: 0.7673 - classification_loss: 0.0909 354/500 [====================>.........] - ETA: 49s - loss: 0.8576 - regression_loss: 0.7668 - classification_loss: 0.0909 355/500 [====================>.........] - ETA: 49s - loss: 0.8572 - regression_loss: 0.7663 - classification_loss: 0.0909 356/500 [====================>.........] 
- ETA: 48s - loss: 0.8574 - regression_loss: 0.7666 - classification_loss: 0.0908 357/500 [====================>.........] - ETA: 48s - loss: 0.8572 - regression_loss: 0.7664 - classification_loss: 0.0907 358/500 [====================>.........] - ETA: 48s - loss: 0.8577 - regression_loss: 0.7668 - classification_loss: 0.0909 359/500 [====================>.........] - ETA: 47s - loss: 0.8576 - regression_loss: 0.7667 - classification_loss: 0.0909 360/500 [====================>.........] - ETA: 47s - loss: 0.8576 - regression_loss: 0.7667 - classification_loss: 0.0909 361/500 [====================>.........] - ETA: 47s - loss: 0.8569 - regression_loss: 0.7662 - classification_loss: 0.0907 362/500 [====================>.........] - ETA: 46s - loss: 0.8557 - regression_loss: 0.7652 - classification_loss: 0.0905 363/500 [====================>.........] - ETA: 46s - loss: 0.8580 - regression_loss: 0.7671 - classification_loss: 0.0909 364/500 [====================>.........] - ETA: 46s - loss: 0.8576 - regression_loss: 0.7668 - classification_loss: 0.0908 365/500 [====================>.........] - ETA: 45s - loss: 0.8575 - regression_loss: 0.7668 - classification_loss: 0.0907 366/500 [====================>.........] - ETA: 45s - loss: 0.8574 - regression_loss: 0.7667 - classification_loss: 0.0907 367/500 [=====================>........] - ETA: 45s - loss: 0.8568 - regression_loss: 0.7663 - classification_loss: 0.0905 368/500 [=====================>........] - ETA: 44s - loss: 0.8551 - regression_loss: 0.7648 - classification_loss: 0.0903 369/500 [=====================>........] - ETA: 44s - loss: 0.8559 - regression_loss: 0.7653 - classification_loss: 0.0906 370/500 [=====================>........] - ETA: 44s - loss: 0.8579 - regression_loss: 0.7675 - classification_loss: 0.0905 371/500 [=====================>........] - ETA: 43s - loss: 0.8571 - regression_loss: 0.7667 - classification_loss: 0.0904 372/500 [=====================>........] 
- ETA: 43s - loss: 0.8556 - regression_loss: 0.7654 - classification_loss: 0.0902 373/500 [=====================>........] - ETA: 43s - loss: 0.8560 - regression_loss: 0.7659 - classification_loss: 0.0901 374/500 [=====================>........] - ETA: 42s - loss: 0.8567 - regression_loss: 0.7666 - classification_loss: 0.0901 375/500 [=====================>........] - ETA: 42s - loss: 0.8563 - regression_loss: 0.7663 - classification_loss: 0.0900 376/500 [=====================>........] - ETA: 42s - loss: 0.8570 - regression_loss: 0.7669 - classification_loss: 0.0900 377/500 [=====================>........] - ETA: 41s - loss: 0.8578 - regression_loss: 0.7677 - classification_loss: 0.0901 378/500 [=====================>........] - ETA: 41s - loss: 0.8582 - regression_loss: 0.7680 - classification_loss: 0.0902 379/500 [=====================>........] - ETA: 41s - loss: 0.8580 - regression_loss: 0.7679 - classification_loss: 0.0901 380/500 [=====================>........] - ETA: 40s - loss: 0.8590 - regression_loss: 0.7687 - classification_loss: 0.0903 381/500 [=====================>........] - ETA: 40s - loss: 0.8578 - regression_loss: 0.7676 - classification_loss: 0.0901 382/500 [=====================>........] - ETA: 40s - loss: 0.8568 - regression_loss: 0.7668 - classification_loss: 0.0900 383/500 [=====================>........] - ETA: 39s - loss: 0.8563 - regression_loss: 0.7664 - classification_loss: 0.0899 384/500 [======================>.......] - ETA: 39s - loss: 0.8556 - regression_loss: 0.7658 - classification_loss: 0.0898 385/500 [======================>.......] - ETA: 39s - loss: 0.8547 - regression_loss: 0.7651 - classification_loss: 0.0896 386/500 [======================>.......] - ETA: 38s - loss: 0.8537 - regression_loss: 0.7642 - classification_loss: 0.0895 387/500 [======================>.......] - ETA: 38s - loss: 0.8546 - regression_loss: 0.7649 - classification_loss: 0.0897 388/500 [======================>.......] 
- ETA: 38s - loss: 0.8546 - regression_loss: 0.7650 - classification_loss: 0.0896
[per-batch progress output for batches 389-499 of epoch 37 elided; loss stayed in the 0.845-0.852 range, classification_loss near 0.087]
500/500 [==============================] - 170s 340ms/step - loss: 0.8480 - regression_loss: 0.7619 - classification_loss: 0.0861
326 instances of class plum with average precision: 0.8235
mAP: 0.8235
Epoch 00037: saving model to ./training/snapshots/resnet101_pascal_37.h5
Epoch 38/150
[per-batch progress output for batches 1-221 of epoch 38 elided; loss fluctuated early (0.56-0.88) and settled near 0.84, classification_loss near 0.080]
222/500 [============>.................]
- ETA: 1:33 - loss: 0.8349 - regression_loss: 0.7549 - classification_loss: 0.0800 223/500 [============>.................] - ETA: 1:33 - loss: 0.8364 - regression_loss: 0.7564 - classification_loss: 0.0800 224/500 [============>.................] - ETA: 1:33 - loss: 0.8368 - regression_loss: 0.7569 - classification_loss: 0.0799 225/500 [============>.................] - ETA: 1:32 - loss: 0.8352 - regression_loss: 0.7556 - classification_loss: 0.0796 226/500 [============>.................] - ETA: 1:32 - loss: 0.8343 - regression_loss: 0.7548 - classification_loss: 0.0795 227/500 [============>.................] - ETA: 1:32 - loss: 0.8368 - regression_loss: 0.7572 - classification_loss: 0.0796 228/500 [============>.................] - ETA: 1:31 - loss: 0.8360 - regression_loss: 0.7564 - classification_loss: 0.0796 229/500 [============>.................] - ETA: 1:31 - loss: 0.8379 - regression_loss: 0.7577 - classification_loss: 0.0803 230/500 [============>.................] - ETA: 1:30 - loss: 0.8496 - regression_loss: 0.7647 - classification_loss: 0.0848 231/500 [============>.................] - ETA: 1:30 - loss: 0.8531 - regression_loss: 0.7680 - classification_loss: 0.0851 232/500 [============>.................] - ETA: 1:30 - loss: 0.8531 - regression_loss: 0.7682 - classification_loss: 0.0850 233/500 [============>.................] - ETA: 1:29 - loss: 0.8554 - regression_loss: 0.7703 - classification_loss: 0.0852 234/500 [=============>................] - ETA: 1:29 - loss: 0.8535 - regression_loss: 0.7686 - classification_loss: 0.0849 235/500 [=============>................] - ETA: 1:29 - loss: 0.8539 - regression_loss: 0.7692 - classification_loss: 0.0847 236/500 [=============>................] - ETA: 1:28 - loss: 0.8535 - regression_loss: 0.7689 - classification_loss: 0.0846 237/500 [=============>................] - ETA: 1:28 - loss: 0.8543 - regression_loss: 0.7696 - classification_loss: 0.0847 238/500 [=============>................] 
- ETA: 1:28 - loss: 0.8538 - regression_loss: 0.7693 - classification_loss: 0.0845 239/500 [=============>................] - ETA: 1:27 - loss: 0.8553 - regression_loss: 0.7705 - classification_loss: 0.0849 240/500 [=============>................] - ETA: 1:27 - loss: 0.8556 - regression_loss: 0.7708 - classification_loss: 0.0848 241/500 [=============>................] - ETA: 1:27 - loss: 0.8546 - regression_loss: 0.7701 - classification_loss: 0.0845 242/500 [=============>................] - ETA: 1:26 - loss: 0.8535 - regression_loss: 0.7691 - classification_loss: 0.0844 243/500 [=============>................] - ETA: 1:26 - loss: 0.8548 - regression_loss: 0.7699 - classification_loss: 0.0849 244/500 [=============>................] - ETA: 1:26 - loss: 0.8547 - regression_loss: 0.7698 - classification_loss: 0.0848 245/500 [=============>................] - ETA: 1:25 - loss: 0.8520 - regression_loss: 0.7675 - classification_loss: 0.0845 246/500 [=============>................] - ETA: 1:25 - loss: 0.8509 - regression_loss: 0.7665 - classification_loss: 0.0844 247/500 [=============>................] - ETA: 1:25 - loss: 0.8516 - regression_loss: 0.7672 - classification_loss: 0.0844 248/500 [=============>................] - ETA: 1:24 - loss: 0.8540 - regression_loss: 0.7687 - classification_loss: 0.0854 249/500 [=============>................] - ETA: 1:24 - loss: 0.8528 - regression_loss: 0.7675 - classification_loss: 0.0853 250/500 [==============>...............] - ETA: 1:24 - loss: 0.8511 - regression_loss: 0.7660 - classification_loss: 0.0851 251/500 [==============>...............] - ETA: 1:23 - loss: 0.8512 - regression_loss: 0.7659 - classification_loss: 0.0853 252/500 [==============>...............] - ETA: 1:23 - loss: 0.8490 - regression_loss: 0.7640 - classification_loss: 0.0851 253/500 [==============>...............] - ETA: 1:23 - loss: 0.8508 - regression_loss: 0.7651 - classification_loss: 0.0857 254/500 [==============>...............] 
- ETA: 1:22 - loss: 0.8486 - regression_loss: 0.7631 - classification_loss: 0.0854 255/500 [==============>...............] - ETA: 1:22 - loss: 0.8475 - regression_loss: 0.7621 - classification_loss: 0.0854 256/500 [==============>...............] - ETA: 1:22 - loss: 0.8500 - regression_loss: 0.7644 - classification_loss: 0.0857 257/500 [==============>...............] - ETA: 1:21 - loss: 0.8499 - regression_loss: 0.7643 - classification_loss: 0.0856 258/500 [==============>...............] - ETA: 1:21 - loss: 0.8501 - regression_loss: 0.7646 - classification_loss: 0.0855 259/500 [==============>...............] - ETA: 1:21 - loss: 0.8515 - regression_loss: 0.7658 - classification_loss: 0.0857 260/500 [==============>...............] - ETA: 1:20 - loss: 0.8511 - regression_loss: 0.7654 - classification_loss: 0.0856 261/500 [==============>...............] - ETA: 1:20 - loss: 0.8502 - regression_loss: 0.7647 - classification_loss: 0.0855 262/500 [==============>...............] - ETA: 1:20 - loss: 0.8487 - regression_loss: 0.7635 - classification_loss: 0.0852 263/500 [==============>...............] - ETA: 1:19 - loss: 0.8484 - regression_loss: 0.7633 - classification_loss: 0.0851 264/500 [==============>...............] - ETA: 1:19 - loss: 0.8477 - regression_loss: 0.7626 - classification_loss: 0.0851 265/500 [==============>...............] - ETA: 1:19 - loss: 0.8457 - regression_loss: 0.7608 - classification_loss: 0.0849 266/500 [==============>...............] - ETA: 1:18 - loss: 0.8448 - regression_loss: 0.7601 - classification_loss: 0.0847 267/500 [===============>..............] - ETA: 1:18 - loss: 0.8436 - regression_loss: 0.7591 - classification_loss: 0.0845 268/500 [===============>..............] - ETA: 1:18 - loss: 0.8446 - regression_loss: 0.7598 - classification_loss: 0.0847 269/500 [===============>..............] - ETA: 1:17 - loss: 0.8450 - regression_loss: 0.7603 - classification_loss: 0.0847 270/500 [===============>..............] 
- ETA: 1:17 - loss: 0.8451 - regression_loss: 0.7605 - classification_loss: 0.0846 271/500 [===============>..............] - ETA: 1:17 - loss: 0.8443 - regression_loss: 0.7596 - classification_loss: 0.0846 272/500 [===============>..............] - ETA: 1:16 - loss: 0.8441 - regression_loss: 0.7597 - classification_loss: 0.0845 273/500 [===============>..............] - ETA: 1:16 - loss: 0.8482 - regression_loss: 0.7628 - classification_loss: 0.0854 274/500 [===============>..............] - ETA: 1:16 - loss: 0.8503 - regression_loss: 0.7647 - classification_loss: 0.0856 275/500 [===============>..............] - ETA: 1:15 - loss: 0.8519 - regression_loss: 0.7664 - classification_loss: 0.0855 276/500 [===============>..............] - ETA: 1:15 - loss: 0.8537 - regression_loss: 0.7678 - classification_loss: 0.0859 277/500 [===============>..............] - ETA: 1:15 - loss: 0.8531 - regression_loss: 0.7674 - classification_loss: 0.0857 278/500 [===============>..............] - ETA: 1:14 - loss: 0.8537 - regression_loss: 0.7680 - classification_loss: 0.0857 279/500 [===============>..............] - ETA: 1:14 - loss: 0.8517 - regression_loss: 0.7663 - classification_loss: 0.0854 280/500 [===============>..............] - ETA: 1:14 - loss: 0.8522 - regression_loss: 0.7669 - classification_loss: 0.0853 281/500 [===============>..............] - ETA: 1:13 - loss: 0.8523 - regression_loss: 0.7669 - classification_loss: 0.0853 282/500 [===============>..............] - ETA: 1:13 - loss: 0.8514 - regression_loss: 0.7663 - classification_loss: 0.0851 283/500 [===============>..............] - ETA: 1:13 - loss: 0.8499 - regression_loss: 0.7650 - classification_loss: 0.0849 284/500 [================>.............] - ETA: 1:12 - loss: 0.8496 - regression_loss: 0.7647 - classification_loss: 0.0849 285/500 [================>.............] - ETA: 1:12 - loss: 0.8488 - regression_loss: 0.7641 - classification_loss: 0.0847 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.8478 - regression_loss: 0.7632 - classification_loss: 0.0846 287/500 [================>.............] - ETA: 1:11 - loss: 0.8508 - regression_loss: 0.7657 - classification_loss: 0.0851 288/500 [================>.............] - ETA: 1:11 - loss: 0.8504 - regression_loss: 0.7654 - classification_loss: 0.0850 289/500 [================>.............] - ETA: 1:11 - loss: 0.8513 - regression_loss: 0.7662 - classification_loss: 0.0851 290/500 [================>.............] - ETA: 1:10 - loss: 0.8513 - regression_loss: 0.7663 - classification_loss: 0.0850 291/500 [================>.............] - ETA: 1:10 - loss: 0.8505 - regression_loss: 0.7655 - classification_loss: 0.0850 292/500 [================>.............] - ETA: 1:10 - loss: 0.8507 - regression_loss: 0.7656 - classification_loss: 0.0851 293/500 [================>.............] - ETA: 1:09 - loss: 0.8497 - regression_loss: 0.7647 - classification_loss: 0.0850 294/500 [================>.............] - ETA: 1:09 - loss: 0.8492 - regression_loss: 0.7643 - classification_loss: 0.0849 295/500 [================>.............] - ETA: 1:09 - loss: 0.8504 - regression_loss: 0.7655 - classification_loss: 0.0850 296/500 [================>.............] - ETA: 1:08 - loss: 0.8495 - regression_loss: 0.7647 - classification_loss: 0.0848 297/500 [================>.............] - ETA: 1:08 - loss: 0.8499 - regression_loss: 0.7651 - classification_loss: 0.0848 298/500 [================>.............] - ETA: 1:08 - loss: 0.8483 - regression_loss: 0.7637 - classification_loss: 0.0846 299/500 [================>.............] - ETA: 1:07 - loss: 0.8476 - regression_loss: 0.7632 - classification_loss: 0.0844 300/500 [=================>............] - ETA: 1:07 - loss: 0.8472 - regression_loss: 0.7629 - classification_loss: 0.0844 301/500 [=================>............] - ETA: 1:07 - loss: 0.8457 - regression_loss: 0.7616 - classification_loss: 0.0841 302/500 [=================>............] 
- ETA: 1:06 - loss: 0.8454 - regression_loss: 0.7613 - classification_loss: 0.0840 303/500 [=================>............] - ETA: 1:06 - loss: 0.8450 - regression_loss: 0.7610 - classification_loss: 0.0839 304/500 [=================>............] - ETA: 1:06 - loss: 0.8451 - regression_loss: 0.7612 - classification_loss: 0.0839 305/500 [=================>............] - ETA: 1:05 - loss: 0.8442 - regression_loss: 0.7604 - classification_loss: 0.0838 306/500 [=================>............] - ETA: 1:05 - loss: 0.8431 - regression_loss: 0.7595 - classification_loss: 0.0836 307/500 [=================>............] - ETA: 1:05 - loss: 0.8442 - regression_loss: 0.7604 - classification_loss: 0.0837 308/500 [=================>............] - ETA: 1:04 - loss: 0.8436 - regression_loss: 0.7600 - classification_loss: 0.0836 309/500 [=================>............] - ETA: 1:04 - loss: 0.8431 - regression_loss: 0.7596 - classification_loss: 0.0835 310/500 [=================>............] - ETA: 1:04 - loss: 0.8444 - regression_loss: 0.7606 - classification_loss: 0.0838 311/500 [=================>............] - ETA: 1:03 - loss: 0.8437 - regression_loss: 0.7601 - classification_loss: 0.0836 312/500 [=================>............] - ETA: 1:03 - loss: 0.8417 - regression_loss: 0.7584 - classification_loss: 0.0833 313/500 [=================>............] - ETA: 1:03 - loss: 0.8435 - regression_loss: 0.7598 - classification_loss: 0.0837 314/500 [=================>............] - ETA: 1:02 - loss: 0.8434 - regression_loss: 0.7599 - classification_loss: 0.0836 315/500 [=================>............] - ETA: 1:02 - loss: 0.8449 - regression_loss: 0.7612 - classification_loss: 0.0837 316/500 [=================>............] - ETA: 1:02 - loss: 0.8460 - regression_loss: 0.7621 - classification_loss: 0.0839 317/500 [==================>...........] - ETA: 1:01 - loss: 0.8448 - regression_loss: 0.7611 - classification_loss: 0.0837 318/500 [==================>...........] 
- ETA: 1:01 - loss: 0.8458 - regression_loss: 0.7619 - classification_loss: 0.0839 319/500 [==================>...........] - ETA: 1:01 - loss: 0.8459 - regression_loss: 0.7622 - classification_loss: 0.0838 320/500 [==================>...........] - ETA: 1:00 - loss: 0.8459 - regression_loss: 0.7623 - classification_loss: 0.0837 321/500 [==================>...........] - ETA: 1:00 - loss: 0.8454 - regression_loss: 0.7618 - classification_loss: 0.0836 322/500 [==================>...........] - ETA: 1:00 - loss: 0.8438 - regression_loss: 0.7603 - classification_loss: 0.0835 323/500 [==================>...........] - ETA: 59s - loss: 0.8442 - regression_loss: 0.7607 - classification_loss: 0.0835  324/500 [==================>...........] - ETA: 59s - loss: 0.8436 - regression_loss: 0.7603 - classification_loss: 0.0833 325/500 [==================>...........] - ETA: 59s - loss: 0.8425 - regression_loss: 0.7593 - classification_loss: 0.0832 326/500 [==================>...........] - ETA: 58s - loss: 0.8415 - regression_loss: 0.7585 - classification_loss: 0.0830 327/500 [==================>...........] - ETA: 58s - loss: 0.8414 - regression_loss: 0.7585 - classification_loss: 0.0829 328/500 [==================>...........] - ETA: 58s - loss: 0.8431 - regression_loss: 0.7598 - classification_loss: 0.0834 329/500 [==================>...........] - ETA: 57s - loss: 0.8422 - regression_loss: 0.7590 - classification_loss: 0.0832 330/500 [==================>...........] - ETA: 57s - loss: 0.8414 - regression_loss: 0.7582 - classification_loss: 0.0832 331/500 [==================>...........] - ETA: 57s - loss: 0.8421 - regression_loss: 0.7589 - classification_loss: 0.0832 332/500 [==================>...........] - ETA: 56s - loss: 0.8413 - regression_loss: 0.7583 - classification_loss: 0.0830 333/500 [==================>...........] - ETA: 56s - loss: 0.8393 - regression_loss: 0.7565 - classification_loss: 0.0828 334/500 [===================>..........] 
- ETA: 56s - loss: 0.8405 - regression_loss: 0.7577 - classification_loss: 0.0829 335/500 [===================>..........] - ETA: 55s - loss: 0.8403 - regression_loss: 0.7575 - classification_loss: 0.0828 336/500 [===================>..........] - ETA: 55s - loss: 0.8415 - regression_loss: 0.7586 - classification_loss: 0.0829 337/500 [===================>..........] - ETA: 55s - loss: 0.8405 - regression_loss: 0.7578 - classification_loss: 0.0827 338/500 [===================>..........] - ETA: 54s - loss: 0.8392 - regression_loss: 0.7566 - classification_loss: 0.0825 339/500 [===================>..........] - ETA: 54s - loss: 0.8399 - regression_loss: 0.7573 - classification_loss: 0.0826 340/500 [===================>..........] - ETA: 54s - loss: 0.8416 - regression_loss: 0.7586 - classification_loss: 0.0830 341/500 [===================>..........] - ETA: 53s - loss: 0.8420 - regression_loss: 0.7590 - classification_loss: 0.0830 342/500 [===================>..........] - ETA: 53s - loss: 0.8452 - regression_loss: 0.7617 - classification_loss: 0.0835 343/500 [===================>..........] - ETA: 53s - loss: 0.8444 - regression_loss: 0.7609 - classification_loss: 0.0835 344/500 [===================>..........] - ETA: 52s - loss: 0.8449 - regression_loss: 0.7615 - classification_loss: 0.0835 345/500 [===================>..........] - ETA: 52s - loss: 0.8449 - regression_loss: 0.7616 - classification_loss: 0.0834 346/500 [===================>..........] - ETA: 52s - loss: 0.8447 - regression_loss: 0.7614 - classification_loss: 0.0833 347/500 [===================>..........] - ETA: 51s - loss: 0.8450 - regression_loss: 0.7617 - classification_loss: 0.0833 348/500 [===================>..........] - ETA: 51s - loss: 0.8449 - regression_loss: 0.7617 - classification_loss: 0.0832 349/500 [===================>..........] - ETA: 51s - loss: 0.8471 - regression_loss: 0.7636 - classification_loss: 0.0836 350/500 [====================>.........] 
- ETA: 50s - loss: 0.8473 - regression_loss: 0.7637 - classification_loss: 0.0835 351/500 [====================>.........] - ETA: 50s - loss: 0.8466 - regression_loss: 0.7633 - classification_loss: 0.0834 352/500 [====================>.........] - ETA: 49s - loss: 0.8462 - regression_loss: 0.7629 - classification_loss: 0.0833 353/500 [====================>.........] - ETA: 49s - loss: 0.8458 - regression_loss: 0.7625 - classification_loss: 0.0833 354/500 [====================>.........] - ETA: 49s - loss: 0.8453 - regression_loss: 0.7621 - classification_loss: 0.0832 355/500 [====================>.........] - ETA: 48s - loss: 0.8460 - regression_loss: 0.7629 - classification_loss: 0.0831 356/500 [====================>.........] - ETA: 48s - loss: 0.8475 - regression_loss: 0.7643 - classification_loss: 0.0832 357/500 [====================>.........] - ETA: 48s - loss: 0.8466 - regression_loss: 0.7636 - classification_loss: 0.0830 358/500 [====================>.........] - ETA: 47s - loss: 0.8465 - regression_loss: 0.7634 - classification_loss: 0.0831 359/500 [====================>.........] - ETA: 47s - loss: 0.8465 - regression_loss: 0.7635 - classification_loss: 0.0831 360/500 [====================>.........] - ETA: 47s - loss: 0.8461 - regression_loss: 0.7631 - classification_loss: 0.0831 361/500 [====================>.........] - ETA: 46s - loss: 0.8469 - regression_loss: 0.7635 - classification_loss: 0.0833 362/500 [====================>.........] - ETA: 46s - loss: 0.8471 - regression_loss: 0.7638 - classification_loss: 0.0833 363/500 [====================>.........] - ETA: 46s - loss: 0.8470 - regression_loss: 0.7637 - classification_loss: 0.0834 364/500 [====================>.........] - ETA: 45s - loss: 0.8463 - regression_loss: 0.7630 - classification_loss: 0.0833 365/500 [====================>.........] - ETA: 45s - loss: 0.8470 - regression_loss: 0.7636 - classification_loss: 0.0834 366/500 [====================>.........] 
- ETA: 45s - loss: 0.8461 - regression_loss: 0.7628 - classification_loss: 0.0833 367/500 [=====================>........] - ETA: 44s - loss: 0.8470 - regression_loss: 0.7633 - classification_loss: 0.0837 368/500 [=====================>........] - ETA: 44s - loss: 0.8469 - regression_loss: 0.7631 - classification_loss: 0.0837 369/500 [=====================>........] - ETA: 44s - loss: 0.8476 - regression_loss: 0.7638 - classification_loss: 0.0838 370/500 [=====================>........] - ETA: 43s - loss: 0.8489 - regression_loss: 0.7649 - classification_loss: 0.0840 371/500 [=====================>........] - ETA: 43s - loss: 0.8485 - regression_loss: 0.7646 - classification_loss: 0.0839 372/500 [=====================>........] - ETA: 43s - loss: 0.8483 - regression_loss: 0.7643 - classification_loss: 0.0840 373/500 [=====================>........] - ETA: 42s - loss: 0.8471 - regression_loss: 0.7633 - classification_loss: 0.0838 374/500 [=====================>........] - ETA: 42s - loss: 0.8460 - regression_loss: 0.7623 - classification_loss: 0.0837 375/500 [=====================>........] - ETA: 42s - loss: 0.8462 - regression_loss: 0.7623 - classification_loss: 0.0839 376/500 [=====================>........] - ETA: 41s - loss: 0.8452 - regression_loss: 0.7614 - classification_loss: 0.0837 377/500 [=====================>........] - ETA: 41s - loss: 0.8449 - regression_loss: 0.7613 - classification_loss: 0.0836 378/500 [=====================>........] - ETA: 41s - loss: 0.8450 - regression_loss: 0.7614 - classification_loss: 0.0836 379/500 [=====================>........] - ETA: 40s - loss: 0.8451 - regression_loss: 0.7615 - classification_loss: 0.0836 380/500 [=====================>........] - ETA: 40s - loss: 0.8446 - regression_loss: 0.7611 - classification_loss: 0.0835 381/500 [=====================>........] - ETA: 40s - loss: 0.8445 - regression_loss: 0.7611 - classification_loss: 0.0834 382/500 [=====================>........] 
- ETA: 39s - loss: 0.8442 - regression_loss: 0.7608 - classification_loss: 0.0834 383/500 [=====================>........] - ETA: 39s - loss: 0.8443 - regression_loss: 0.7609 - classification_loss: 0.0834 384/500 [======================>.......] - ETA: 39s - loss: 0.8437 - regression_loss: 0.7605 - classification_loss: 0.0832 385/500 [======================>.......] - ETA: 38s - loss: 0.8431 - regression_loss: 0.7600 - classification_loss: 0.0831 386/500 [======================>.......] - ETA: 38s - loss: 0.8430 - regression_loss: 0.7599 - classification_loss: 0.0832 387/500 [======================>.......] - ETA: 38s - loss: 0.8454 - regression_loss: 0.7614 - classification_loss: 0.0840 388/500 [======================>.......] - ETA: 37s - loss: 0.8453 - regression_loss: 0.7614 - classification_loss: 0.0839 389/500 [======================>.......] - ETA: 37s - loss: 0.8452 - regression_loss: 0.7613 - classification_loss: 0.0839 390/500 [======================>.......] - ETA: 37s - loss: 0.8445 - regression_loss: 0.7607 - classification_loss: 0.0838 391/500 [======================>.......] - ETA: 36s - loss: 0.8437 - regression_loss: 0.7600 - classification_loss: 0.0837 392/500 [======================>.......] - ETA: 36s - loss: 0.8438 - regression_loss: 0.7601 - classification_loss: 0.0837 393/500 [======================>.......] - ETA: 36s - loss: 0.8452 - regression_loss: 0.7613 - classification_loss: 0.0838 394/500 [======================>.......] - ETA: 35s - loss: 0.8482 - regression_loss: 0.7638 - classification_loss: 0.0843 395/500 [======================>.......] - ETA: 35s - loss: 0.8471 - regression_loss: 0.7629 - classification_loss: 0.0842 396/500 [======================>.......] - ETA: 35s - loss: 0.8468 - regression_loss: 0.7627 - classification_loss: 0.0841 397/500 [======================>.......] - ETA: 34s - loss: 0.8462 - regression_loss: 0.7623 - classification_loss: 0.0840 398/500 [======================>.......] 
- ETA: 34s - loss: 0.8461 - regression_loss: 0.7623 - classification_loss: 0.0838 399/500 [======================>.......] - ETA: 34s - loss: 0.8455 - regression_loss: 0.7617 - classification_loss: 0.0839 400/500 [=======================>......] - ETA: 33s - loss: 0.8449 - regression_loss: 0.7611 - classification_loss: 0.0838 401/500 [=======================>......] - ETA: 33s - loss: 0.8458 - regression_loss: 0.7618 - classification_loss: 0.0839 402/500 [=======================>......] - ETA: 33s - loss: 0.8458 - regression_loss: 0.7619 - classification_loss: 0.0839 403/500 [=======================>......] - ETA: 32s - loss: 0.8446 - regression_loss: 0.7609 - classification_loss: 0.0837 404/500 [=======================>......] - ETA: 32s - loss: 0.8446 - regression_loss: 0.7610 - classification_loss: 0.0836 405/500 [=======================>......] - ETA: 32s - loss: 0.8448 - regression_loss: 0.7612 - classification_loss: 0.0836 406/500 [=======================>......] - ETA: 31s - loss: 0.8456 - regression_loss: 0.7618 - classification_loss: 0.0838 407/500 [=======================>......] - ETA: 31s - loss: 0.8457 - regression_loss: 0.7619 - classification_loss: 0.0838 408/500 [=======================>......] - ETA: 31s - loss: 0.8452 - regression_loss: 0.7616 - classification_loss: 0.0837 409/500 [=======================>......] - ETA: 30s - loss: 0.8451 - regression_loss: 0.7615 - classification_loss: 0.0836 410/500 [=======================>......] - ETA: 30s - loss: 0.8445 - regression_loss: 0.7610 - classification_loss: 0.0835 411/500 [=======================>......] - ETA: 30s - loss: 0.8437 - regression_loss: 0.7603 - classification_loss: 0.0834 412/500 [=======================>......] - ETA: 29s - loss: 0.8426 - regression_loss: 0.7594 - classification_loss: 0.0832 413/500 [=======================>......] - ETA: 29s - loss: 0.8416 - regression_loss: 0.7585 - classification_loss: 0.0831 414/500 [=======================>......] 
- ETA: 29s - loss: 0.8420 - regression_loss: 0.7589 - classification_loss: 0.0831 415/500 [=======================>......] - ETA: 28s - loss: 0.8421 - regression_loss: 0.7590 - classification_loss: 0.0831 416/500 [=======================>......] - ETA: 28s - loss: 0.8430 - regression_loss: 0.7596 - classification_loss: 0.0834 417/500 [========================>.....] - ETA: 28s - loss: 0.8426 - regression_loss: 0.7593 - classification_loss: 0.0833 418/500 [========================>.....] - ETA: 27s - loss: 0.8428 - regression_loss: 0.7595 - classification_loss: 0.0833 419/500 [========================>.....] - ETA: 27s - loss: 0.8426 - regression_loss: 0.7595 - classification_loss: 0.0832 420/500 [========================>.....] - ETA: 27s - loss: 0.8434 - regression_loss: 0.7602 - classification_loss: 0.0832 421/500 [========================>.....] - ETA: 26s - loss: 0.8445 - regression_loss: 0.7611 - classification_loss: 0.0834 422/500 [========================>.....] - ETA: 26s - loss: 0.8439 - regression_loss: 0.7606 - classification_loss: 0.0833 423/500 [========================>.....] - ETA: 26s - loss: 0.8437 - regression_loss: 0.7605 - classification_loss: 0.0832 424/500 [========================>.....] - ETA: 25s - loss: 0.8443 - regression_loss: 0.7612 - classification_loss: 0.0831 425/500 [========================>.....] - ETA: 25s - loss: 0.8441 - regression_loss: 0.7610 - classification_loss: 0.0831 426/500 [========================>.....] - ETA: 25s - loss: 0.8432 - regression_loss: 0.7602 - classification_loss: 0.0830 427/500 [========================>.....] - ETA: 24s - loss: 0.8428 - regression_loss: 0.7599 - classification_loss: 0.0829 428/500 [========================>.....] - ETA: 24s - loss: 0.8426 - regression_loss: 0.7598 - classification_loss: 0.0828 429/500 [========================>.....] - ETA: 23s - loss: 0.8419 - regression_loss: 0.7593 - classification_loss: 0.0827 430/500 [========================>.....] 
[Epoch 38: per-batch progress-bar redraws for steps 431-499 elided]
500/500 [==============================] - 169s 338ms/step - loss: 0.8254 - regression_loss: 0.7457 - classification_loss: 0.0798
326 instances of class plum with average precision: 0.8280
mAP: 0.8280
Epoch 00038: saving model to ./training/snapshots/resnet101_pascal_38.h5
Epoch 39/150
[Epoch 39: per-batch progress-bar redraws for steps 1-9 elided]
[Epoch 39: per-batch progress-bar redraws for steps 10-264 elided; running totals at step 264/500 were loss: 0.7896 - regression_loss: 0.7066 - classification_loss: 0.0830, ETA 1:19]
265/500 [==============>...............]
- ETA: 1:19 - loss: 0.7900 - regression_loss: 0.7071 - classification_loss: 0.0829 266/500 [==============>...............] - ETA: 1:19 - loss: 0.7885 - regression_loss: 0.7058 - classification_loss: 0.0827 267/500 [===============>..............] - ETA: 1:18 - loss: 0.7862 - regression_loss: 0.7037 - classification_loss: 0.0825 268/500 [===============>..............] - ETA: 1:18 - loss: 0.7859 - regression_loss: 0.7034 - classification_loss: 0.0824 269/500 [===============>..............] - ETA: 1:18 - loss: 0.7843 - regression_loss: 0.7021 - classification_loss: 0.0822 270/500 [===============>..............] - ETA: 1:17 - loss: 0.7855 - regression_loss: 0.7034 - classification_loss: 0.0821 271/500 [===============>..............] - ETA: 1:17 - loss: 0.7866 - regression_loss: 0.7044 - classification_loss: 0.0822 272/500 [===============>..............] - ETA: 1:17 - loss: 0.7865 - regression_loss: 0.7042 - classification_loss: 0.0823 273/500 [===============>..............] - ETA: 1:16 - loss: 0.7853 - regression_loss: 0.7032 - classification_loss: 0.0821 274/500 [===============>..............] - ETA: 1:16 - loss: 0.7857 - regression_loss: 0.7036 - classification_loss: 0.0821 275/500 [===============>..............] - ETA: 1:16 - loss: 0.7847 - regression_loss: 0.7028 - classification_loss: 0.0819 276/500 [===============>..............] - ETA: 1:15 - loss: 0.7845 - regression_loss: 0.7027 - classification_loss: 0.0818 277/500 [===============>..............] - ETA: 1:15 - loss: 0.7841 - regression_loss: 0.7025 - classification_loss: 0.0816 278/500 [===============>..............] - ETA: 1:15 - loss: 0.7865 - regression_loss: 0.7047 - classification_loss: 0.0818 279/500 [===============>..............] - ETA: 1:14 - loss: 0.7899 - regression_loss: 0.7076 - classification_loss: 0.0823 280/500 [===============>..............] - ETA: 1:14 - loss: 0.7892 - regression_loss: 0.7070 - classification_loss: 0.0822 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.7902 - regression_loss: 0.7079 - classification_loss: 0.0824 282/500 [===============>..............] - ETA: 1:13 - loss: 0.7923 - regression_loss: 0.7099 - classification_loss: 0.0824 283/500 [===============>..............] - ETA: 1:13 - loss: 0.7927 - regression_loss: 0.7103 - classification_loss: 0.0824 284/500 [================>.............] - ETA: 1:13 - loss: 0.7911 - regression_loss: 0.7090 - classification_loss: 0.0821 285/500 [================>.............] - ETA: 1:12 - loss: 0.7904 - regression_loss: 0.7084 - classification_loss: 0.0820 286/500 [================>.............] - ETA: 1:12 - loss: 0.7890 - regression_loss: 0.7073 - classification_loss: 0.0818 287/500 [================>.............] - ETA: 1:12 - loss: 0.7902 - regression_loss: 0.7084 - classification_loss: 0.0819 288/500 [================>.............] - ETA: 1:11 - loss: 0.7908 - regression_loss: 0.7089 - classification_loss: 0.0819 289/500 [================>.............] - ETA: 1:11 - loss: 0.7916 - regression_loss: 0.7097 - classification_loss: 0.0819 290/500 [================>.............] - ETA: 1:11 - loss: 0.7908 - regression_loss: 0.7090 - classification_loss: 0.0818 291/500 [================>.............] - ETA: 1:10 - loss: 0.7908 - regression_loss: 0.7091 - classification_loss: 0.0817 292/500 [================>.............] - ETA: 1:10 - loss: 0.7914 - regression_loss: 0.7098 - classification_loss: 0.0817 293/500 [================>.............] - ETA: 1:10 - loss: 0.7909 - regression_loss: 0.7094 - classification_loss: 0.0815 294/500 [================>.............] - ETA: 1:09 - loss: 0.7919 - regression_loss: 0.7103 - classification_loss: 0.0816 295/500 [================>.............] - ETA: 1:09 - loss: 0.7919 - regression_loss: 0.7104 - classification_loss: 0.0815 296/500 [================>.............] - ETA: 1:09 - loss: 0.7936 - regression_loss: 0.7117 - classification_loss: 0.0819 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.7930 - regression_loss: 0.7113 - classification_loss: 0.0817 298/500 [================>.............] - ETA: 1:08 - loss: 0.7937 - regression_loss: 0.7117 - classification_loss: 0.0821 299/500 [================>.............] - ETA: 1:08 - loss: 0.7947 - regression_loss: 0.7126 - classification_loss: 0.0821 300/500 [=================>............] - ETA: 1:07 - loss: 0.7942 - regression_loss: 0.7123 - classification_loss: 0.0819 301/500 [=================>............] - ETA: 1:07 - loss: 0.7941 - regression_loss: 0.7122 - classification_loss: 0.0819 302/500 [=================>............] - ETA: 1:07 - loss: 0.7946 - regression_loss: 0.7128 - classification_loss: 0.0817 303/500 [=================>............] - ETA: 1:06 - loss: 0.7937 - regression_loss: 0.7121 - classification_loss: 0.0815 304/500 [=================>............] - ETA: 1:06 - loss: 0.7944 - regression_loss: 0.7125 - classification_loss: 0.0819 305/500 [=================>............] - ETA: 1:06 - loss: 0.7935 - regression_loss: 0.7117 - classification_loss: 0.0818 306/500 [=================>............] - ETA: 1:05 - loss: 0.7937 - regression_loss: 0.7121 - classification_loss: 0.0817 307/500 [=================>............] - ETA: 1:05 - loss: 0.7923 - regression_loss: 0.7108 - classification_loss: 0.0815 308/500 [=================>............] - ETA: 1:05 - loss: 0.7909 - regression_loss: 0.7096 - classification_loss: 0.0813 309/500 [=================>............] - ETA: 1:04 - loss: 0.7904 - regression_loss: 0.7092 - classification_loss: 0.0812 310/500 [=================>............] - ETA: 1:04 - loss: 0.7980 - regression_loss: 0.7148 - classification_loss: 0.0831 311/500 [=================>............] - ETA: 1:04 - loss: 0.8000 - regression_loss: 0.7166 - classification_loss: 0.0834 312/500 [=================>............] - ETA: 1:03 - loss: 0.7987 - regression_loss: 0.7155 - classification_loss: 0.0831 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.7995 - regression_loss: 0.7159 - classification_loss: 0.0835 314/500 [=================>............] - ETA: 1:03 - loss: 0.7989 - regression_loss: 0.7153 - classification_loss: 0.0836 315/500 [=================>............] - ETA: 1:02 - loss: 0.7985 - regression_loss: 0.7149 - classification_loss: 0.0836 316/500 [=================>............] - ETA: 1:02 - loss: 0.7984 - regression_loss: 0.7149 - classification_loss: 0.0835 317/500 [==================>...........] - ETA: 1:01 - loss: 0.7985 - regression_loss: 0.7149 - classification_loss: 0.0836 318/500 [==================>...........] - ETA: 1:01 - loss: 0.7978 - regression_loss: 0.7144 - classification_loss: 0.0834 319/500 [==================>...........] - ETA: 1:01 - loss: 0.7980 - regression_loss: 0.7145 - classification_loss: 0.0834 320/500 [==================>...........] - ETA: 1:00 - loss: 0.7955 - regression_loss: 0.7123 - classification_loss: 0.0832 321/500 [==================>...........] - ETA: 1:00 - loss: 0.7949 - regression_loss: 0.7118 - classification_loss: 0.0831 322/500 [==================>...........] - ETA: 1:00 - loss: 0.7940 - regression_loss: 0.7110 - classification_loss: 0.0829 323/500 [==================>...........] - ETA: 59s - loss: 0.7931 - regression_loss: 0.7103 - classification_loss: 0.0828  324/500 [==================>...........] - ETA: 59s - loss: 0.7942 - regression_loss: 0.7112 - classification_loss: 0.0829 325/500 [==================>...........] - ETA: 59s - loss: 0.7932 - regression_loss: 0.7105 - classification_loss: 0.0827 326/500 [==================>...........] - ETA: 58s - loss: 0.7937 - regression_loss: 0.7109 - classification_loss: 0.0828 327/500 [==================>...........] - ETA: 58s - loss: 0.7947 - regression_loss: 0.7118 - classification_loss: 0.0830 328/500 [==================>...........] - ETA: 58s - loss: 0.7943 - regression_loss: 0.7115 - classification_loss: 0.0828 329/500 [==================>...........] 
- ETA: 57s - loss: 0.7933 - regression_loss: 0.7106 - classification_loss: 0.0827 330/500 [==================>...........] - ETA: 57s - loss: 0.7947 - regression_loss: 0.7118 - classification_loss: 0.0829 331/500 [==================>...........] - ETA: 57s - loss: 0.7930 - regression_loss: 0.7104 - classification_loss: 0.0826 332/500 [==================>...........] - ETA: 56s - loss: 0.7949 - regression_loss: 0.7116 - classification_loss: 0.0833 333/500 [==================>...........] - ETA: 56s - loss: 0.7946 - regression_loss: 0.7114 - classification_loss: 0.0831 334/500 [===================>..........] - ETA: 56s - loss: 0.7939 - regression_loss: 0.7110 - classification_loss: 0.0829 335/500 [===================>..........] - ETA: 55s - loss: 0.7942 - regression_loss: 0.7114 - classification_loss: 0.0828 336/500 [===================>..........] - ETA: 55s - loss: 0.7943 - regression_loss: 0.7116 - classification_loss: 0.0827 337/500 [===================>..........] - ETA: 55s - loss: 0.7941 - regression_loss: 0.7114 - classification_loss: 0.0827 338/500 [===================>..........] - ETA: 54s - loss: 0.7939 - regression_loss: 0.7112 - classification_loss: 0.0827 339/500 [===================>..........] - ETA: 54s - loss: 0.7939 - regression_loss: 0.7113 - classification_loss: 0.0827 340/500 [===================>..........] - ETA: 54s - loss: 0.7935 - regression_loss: 0.7109 - classification_loss: 0.0826 341/500 [===================>..........] - ETA: 53s - loss: 0.7927 - regression_loss: 0.7103 - classification_loss: 0.0824 342/500 [===================>..........] - ETA: 53s - loss: 0.7929 - regression_loss: 0.7105 - classification_loss: 0.0824 343/500 [===================>..........] - ETA: 53s - loss: 0.7926 - regression_loss: 0.7103 - classification_loss: 0.0822 344/500 [===================>..........] - ETA: 52s - loss: 0.7917 - regression_loss: 0.7096 - classification_loss: 0.0821 345/500 [===================>..........] 
- ETA: 52s - loss: 0.7941 - regression_loss: 0.7113 - classification_loss: 0.0828 346/500 [===================>..........] - ETA: 52s - loss: 0.7934 - regression_loss: 0.7107 - classification_loss: 0.0826 347/500 [===================>..........] - ETA: 51s - loss: 0.7927 - regression_loss: 0.7102 - classification_loss: 0.0826 348/500 [===================>..........] - ETA: 51s - loss: 0.7928 - regression_loss: 0.7103 - classification_loss: 0.0825 349/500 [===================>..........] - ETA: 51s - loss: 0.7923 - regression_loss: 0.7099 - classification_loss: 0.0824 350/500 [====================>.........] - ETA: 50s - loss: 0.7921 - regression_loss: 0.7097 - classification_loss: 0.0824 351/500 [====================>.........] - ETA: 50s - loss: 0.7914 - regression_loss: 0.7092 - classification_loss: 0.0822 352/500 [====================>.........] - ETA: 50s - loss: 0.7915 - regression_loss: 0.7093 - classification_loss: 0.0822 353/500 [====================>.........] - ETA: 49s - loss: 0.7917 - regression_loss: 0.7094 - classification_loss: 0.0823 354/500 [====================>.........] - ETA: 49s - loss: 0.7922 - regression_loss: 0.7098 - classification_loss: 0.0824 355/500 [====================>.........] - ETA: 49s - loss: 0.7912 - regression_loss: 0.7090 - classification_loss: 0.0822 356/500 [====================>.........] - ETA: 48s - loss: 0.7921 - regression_loss: 0.7099 - classification_loss: 0.0822 357/500 [====================>.........] - ETA: 48s - loss: 0.7924 - regression_loss: 0.7103 - classification_loss: 0.0821 358/500 [====================>.........] - ETA: 48s - loss: 0.7931 - regression_loss: 0.7110 - classification_loss: 0.0821 359/500 [====================>.........] - ETA: 47s - loss: 0.7937 - regression_loss: 0.7115 - classification_loss: 0.0822 360/500 [====================>.........] - ETA: 47s - loss: 0.7967 - regression_loss: 0.7139 - classification_loss: 0.0828 361/500 [====================>.........] 
- ETA: 47s - loss: 0.7964 - regression_loss: 0.7137 - classification_loss: 0.0827 362/500 [====================>.........] - ETA: 46s - loss: 0.7964 - regression_loss: 0.7138 - classification_loss: 0.0827 363/500 [====================>.........] - ETA: 46s - loss: 0.7964 - regression_loss: 0.7140 - classification_loss: 0.0825 364/500 [====================>.........] - ETA: 46s - loss: 0.7962 - regression_loss: 0.7138 - classification_loss: 0.0824 365/500 [====================>.........] - ETA: 45s - loss: 0.7981 - regression_loss: 0.7152 - classification_loss: 0.0829 366/500 [====================>.........] - ETA: 45s - loss: 0.7980 - regression_loss: 0.7151 - classification_loss: 0.0829 367/500 [=====================>........] - ETA: 45s - loss: 0.7978 - regression_loss: 0.7150 - classification_loss: 0.0828 368/500 [=====================>........] - ETA: 44s - loss: 0.7979 - regression_loss: 0.7151 - classification_loss: 0.0828 369/500 [=====================>........] - ETA: 44s - loss: 0.7980 - regression_loss: 0.7152 - classification_loss: 0.0828 370/500 [=====================>........] - ETA: 44s - loss: 0.7986 - regression_loss: 0.7158 - classification_loss: 0.0828 371/500 [=====================>........] - ETA: 43s - loss: 0.8000 - regression_loss: 0.7173 - classification_loss: 0.0828 372/500 [=====================>........] - ETA: 43s - loss: 0.8007 - regression_loss: 0.7179 - classification_loss: 0.0828 373/500 [=====================>........] - ETA: 43s - loss: 0.8004 - regression_loss: 0.7177 - classification_loss: 0.0827 374/500 [=====================>........] - ETA: 42s - loss: 0.8015 - regression_loss: 0.7187 - classification_loss: 0.0828 375/500 [=====================>........] - ETA: 42s - loss: 0.8023 - regression_loss: 0.7194 - classification_loss: 0.0829 376/500 [=====================>........] - ETA: 42s - loss: 0.8018 - regression_loss: 0.7190 - classification_loss: 0.0828 377/500 [=====================>........] 
- ETA: 41s - loss: 0.8025 - regression_loss: 0.7195 - classification_loss: 0.0830 378/500 [=====================>........] - ETA: 41s - loss: 0.8048 - regression_loss: 0.7217 - classification_loss: 0.0831 379/500 [=====================>........] - ETA: 41s - loss: 0.8055 - regression_loss: 0.7221 - classification_loss: 0.0834 380/500 [=====================>........] - ETA: 40s - loss: 0.8044 - regression_loss: 0.7212 - classification_loss: 0.0832 381/500 [=====================>........] - ETA: 40s - loss: 0.8037 - regression_loss: 0.7206 - classification_loss: 0.0831 382/500 [=====================>........] - ETA: 39s - loss: 0.8062 - regression_loss: 0.7226 - classification_loss: 0.0836 383/500 [=====================>........] - ETA: 39s - loss: 0.8079 - regression_loss: 0.7241 - classification_loss: 0.0838 384/500 [======================>.......] - ETA: 39s - loss: 0.8087 - regression_loss: 0.7251 - classification_loss: 0.0836 385/500 [======================>.......] - ETA: 38s - loss: 0.8090 - regression_loss: 0.7254 - classification_loss: 0.0836 386/500 [======================>.......] - ETA: 38s - loss: 0.8088 - regression_loss: 0.7254 - classification_loss: 0.0835 387/500 [======================>.......] - ETA: 38s - loss: 0.8083 - regression_loss: 0.7250 - classification_loss: 0.0833 388/500 [======================>.......] - ETA: 37s - loss: 0.8080 - regression_loss: 0.7248 - classification_loss: 0.0832 389/500 [======================>.......] - ETA: 37s - loss: 0.8070 - regression_loss: 0.7239 - classification_loss: 0.0831 390/500 [======================>.......] - ETA: 37s - loss: 0.8076 - regression_loss: 0.7232 - classification_loss: 0.0845 391/500 [======================>.......] - ETA: 36s - loss: 0.8091 - regression_loss: 0.7245 - classification_loss: 0.0846 392/500 [======================>.......] - ETA: 36s - loss: 0.8087 - regression_loss: 0.7242 - classification_loss: 0.0845 393/500 [======================>.......] 
- ETA: 36s - loss: 0.8086 - regression_loss: 0.7242 - classification_loss: 0.0844 394/500 [======================>.......] - ETA: 35s - loss: 0.8081 - regression_loss: 0.7238 - classification_loss: 0.0843 395/500 [======================>.......] - ETA: 35s - loss: 0.8083 - regression_loss: 0.7241 - classification_loss: 0.0842 396/500 [======================>.......] - ETA: 35s - loss: 0.8090 - regression_loss: 0.7248 - classification_loss: 0.0843 397/500 [======================>.......] - ETA: 34s - loss: 0.8085 - regression_loss: 0.7243 - classification_loss: 0.0842 398/500 [======================>.......] - ETA: 34s - loss: 0.8081 - regression_loss: 0.7240 - classification_loss: 0.0841 399/500 [======================>.......] - ETA: 34s - loss: 0.8080 - regression_loss: 0.7239 - classification_loss: 0.0841 400/500 [=======================>......] - ETA: 33s - loss: 0.8084 - regression_loss: 0.7243 - classification_loss: 0.0842 401/500 [=======================>......] - ETA: 33s - loss: 0.8074 - regression_loss: 0.7233 - classification_loss: 0.0841 402/500 [=======================>......] - ETA: 33s - loss: 0.8068 - regression_loss: 0.7228 - classification_loss: 0.0840 403/500 [=======================>......] - ETA: 32s - loss: 0.8052 - regression_loss: 0.7214 - classification_loss: 0.0838 404/500 [=======================>......] - ETA: 32s - loss: 0.8042 - regression_loss: 0.7205 - classification_loss: 0.0836 405/500 [=======================>......] - ETA: 32s - loss: 0.8043 - regression_loss: 0.7207 - classification_loss: 0.0836 406/500 [=======================>......] - ETA: 31s - loss: 0.8034 - regression_loss: 0.7199 - classification_loss: 0.0835 407/500 [=======================>......] - ETA: 31s - loss: 0.8029 - regression_loss: 0.7195 - classification_loss: 0.0835 408/500 [=======================>......] - ETA: 31s - loss: 0.8034 - regression_loss: 0.7200 - classification_loss: 0.0835 409/500 [=======================>......] 
- ETA: 30s - loss: 0.8030 - regression_loss: 0.7196 - classification_loss: 0.0834 410/500 [=======================>......] - ETA: 30s - loss: 0.8029 - regression_loss: 0.7194 - classification_loss: 0.0835 411/500 [=======================>......] - ETA: 30s - loss: 0.8033 - regression_loss: 0.7198 - classification_loss: 0.0835 412/500 [=======================>......] - ETA: 29s - loss: 0.8041 - regression_loss: 0.7204 - classification_loss: 0.0836 413/500 [=======================>......] - ETA: 29s - loss: 0.8048 - regression_loss: 0.7212 - classification_loss: 0.0836 414/500 [=======================>......] - ETA: 29s - loss: 0.8057 - regression_loss: 0.7221 - classification_loss: 0.0836 415/500 [=======================>......] - ETA: 28s - loss: 0.8064 - regression_loss: 0.7225 - classification_loss: 0.0839 416/500 [=======================>......] - ETA: 28s - loss: 0.8076 - regression_loss: 0.7233 - classification_loss: 0.0843 417/500 [========================>.....] - ETA: 28s - loss: 0.8099 - regression_loss: 0.7252 - classification_loss: 0.0847 418/500 [========================>.....] - ETA: 27s - loss: 0.8108 - regression_loss: 0.7254 - classification_loss: 0.0854 419/500 [========================>.....] - ETA: 27s - loss: 0.8111 - regression_loss: 0.7256 - classification_loss: 0.0855 420/500 [========================>.....] - ETA: 27s - loss: 0.8126 - regression_loss: 0.7268 - classification_loss: 0.0858 421/500 [========================>.....] - ETA: 26s - loss: 0.8123 - regression_loss: 0.7265 - classification_loss: 0.0858 422/500 [========================>.....] - ETA: 26s - loss: 0.8110 - regression_loss: 0.7254 - classification_loss: 0.0856 423/500 [========================>.....] - ETA: 26s - loss: 0.8102 - regression_loss: 0.7247 - classification_loss: 0.0855 424/500 [========================>.....] - ETA: 25s - loss: 0.8090 - regression_loss: 0.7236 - classification_loss: 0.0854 425/500 [========================>.....] 
- ETA: 25s - loss: 0.8098 - regression_loss: 0.7243 - classification_loss: 0.0855 426/500 [========================>.....] - ETA: 25s - loss: 0.8095 - regression_loss: 0.7240 - classification_loss: 0.0855 427/500 [========================>.....] - ETA: 24s - loss: 0.8094 - regression_loss: 0.7240 - classification_loss: 0.0855 428/500 [========================>.....] - ETA: 24s - loss: 0.8110 - regression_loss: 0.7252 - classification_loss: 0.0858 429/500 [========================>.....] - ETA: 24s - loss: 0.8113 - regression_loss: 0.7255 - classification_loss: 0.0858 430/500 [========================>.....] - ETA: 23s - loss: 0.8109 - regression_loss: 0.7251 - classification_loss: 0.0858 431/500 [========================>.....] - ETA: 23s - loss: 0.8110 - regression_loss: 0.7252 - classification_loss: 0.0858 432/500 [========================>.....] - ETA: 23s - loss: 0.8117 - regression_loss: 0.7259 - classification_loss: 0.0858 433/500 [========================>.....] - ETA: 22s - loss: 0.8113 - regression_loss: 0.7255 - classification_loss: 0.0858 434/500 [=========================>....] - ETA: 22s - loss: 0.8121 - regression_loss: 0.7263 - classification_loss: 0.0858 435/500 [=========================>....] - ETA: 22s - loss: 0.8124 - regression_loss: 0.7264 - classification_loss: 0.0860 436/500 [=========================>....] - ETA: 21s - loss: 0.8124 - regression_loss: 0.7266 - classification_loss: 0.0859 437/500 [=========================>....] - ETA: 21s - loss: 0.8123 - regression_loss: 0.7265 - classification_loss: 0.0858 438/500 [=========================>....] - ETA: 21s - loss: 0.8119 - regression_loss: 0.7262 - classification_loss: 0.0857 439/500 [=========================>....] - ETA: 20s - loss: 0.8124 - regression_loss: 0.7265 - classification_loss: 0.0859 440/500 [=========================>....] - ETA: 20s - loss: 0.8125 - regression_loss: 0.7265 - classification_loss: 0.0859 441/500 [=========================>....] 
- ETA: 20s - loss: 0.8120 - regression_loss: 0.7261 - classification_loss: 0.0860 442/500 [=========================>....] - ETA: 19s - loss: 0.8118 - regression_loss: 0.7258 - classification_loss: 0.0860 443/500 [=========================>....] - ETA: 19s - loss: 0.8113 - regression_loss: 0.7253 - classification_loss: 0.0860 444/500 [=========================>....] - ETA: 18s - loss: 0.8113 - regression_loss: 0.7253 - classification_loss: 0.0860 445/500 [=========================>....] - ETA: 18s - loss: 0.8108 - regression_loss: 0.7250 - classification_loss: 0.0858 446/500 [=========================>....] - ETA: 18s - loss: 0.8128 - regression_loss: 0.7263 - classification_loss: 0.0865 447/500 [=========================>....] - ETA: 17s - loss: 0.8115 - regression_loss: 0.7252 - classification_loss: 0.0863 448/500 [=========================>....] - ETA: 17s - loss: 0.8110 - regression_loss: 0.7248 - classification_loss: 0.0862 449/500 [=========================>....] - ETA: 17s - loss: 0.8104 - regression_loss: 0.7243 - classification_loss: 0.0861 450/500 [==========================>...] - ETA: 16s - loss: 0.8116 - regression_loss: 0.7256 - classification_loss: 0.0860 451/500 [==========================>...] - ETA: 16s - loss: 0.8110 - regression_loss: 0.7251 - classification_loss: 0.0859 452/500 [==========================>...] - ETA: 16s - loss: 0.8111 - regression_loss: 0.7253 - classification_loss: 0.0858 453/500 [==========================>...] - ETA: 15s - loss: 0.8104 - regression_loss: 0.7248 - classification_loss: 0.0856 454/500 [==========================>...] - ETA: 15s - loss: 0.8095 - regression_loss: 0.7240 - classification_loss: 0.0855 455/500 [==========================>...] - ETA: 15s - loss: 0.8096 - regression_loss: 0.7241 - classification_loss: 0.0855 456/500 [==========================>...] - ETA: 14s - loss: 0.8100 - regression_loss: 0.7244 - classification_loss: 0.0857 457/500 [==========================>...] 
- ETA: 14s - loss: 0.8100 - regression_loss: 0.7244 - classification_loss: 0.0855 458/500 [==========================>...] - ETA: 14s - loss: 0.8110 - regression_loss: 0.7253 - classification_loss: 0.0856 459/500 [==========================>...] - ETA: 13s - loss: 0.8107 - regression_loss: 0.7252 - classification_loss: 0.0856 460/500 [==========================>...] - ETA: 13s - loss: 0.8099 - regression_loss: 0.7245 - classification_loss: 0.0854 461/500 [==========================>...] - ETA: 13s - loss: 0.8091 - regression_loss: 0.7238 - classification_loss: 0.0853 462/500 [==========================>...] - ETA: 12s - loss: 0.8084 - regression_loss: 0.7232 - classification_loss: 0.0852 463/500 [==========================>...] - ETA: 12s - loss: 0.8079 - regression_loss: 0.7228 - classification_loss: 0.0851 464/500 [==========================>...] - ETA: 12s - loss: 0.8085 - regression_loss: 0.7234 - classification_loss: 0.0851 465/500 [==========================>...] - ETA: 11s - loss: 0.8090 - regression_loss: 0.7239 - classification_loss: 0.0852 466/500 [==========================>...] - ETA: 11s - loss: 0.8087 - regression_loss: 0.7236 - classification_loss: 0.0851 467/500 [===========================>..] - ETA: 11s - loss: 0.8103 - regression_loss: 0.7253 - classification_loss: 0.0850 468/500 [===========================>..] - ETA: 10s - loss: 0.8087 - regression_loss: 0.7237 - classification_loss: 0.0850 469/500 [===========================>..] - ETA: 10s - loss: 0.8096 - regression_loss: 0.7245 - classification_loss: 0.0851 470/500 [===========================>..] - ETA: 10s - loss: 0.8088 - regression_loss: 0.7239 - classification_loss: 0.0850 471/500 [===========================>..] - ETA: 9s - loss: 0.8095 - regression_loss: 0.7244 - classification_loss: 0.0851  472/500 [===========================>..] - ETA: 9s - loss: 0.8088 - regression_loss: 0.7238 - classification_loss: 0.0850 473/500 [===========================>..] 
- ETA: 9s - loss: 0.8080 - regression_loss: 0.7231 - classification_loss: 0.0849 474/500 [===========================>..] - ETA: 8s - loss: 0.8079 - regression_loss: 0.7230 - classification_loss: 0.0849 475/500 [===========================>..] - ETA: 8s - loss: 0.8080 - regression_loss: 0.7232 - classification_loss: 0.0848 476/500 [===========================>..] - ETA: 8s - loss: 0.8094 - regression_loss: 0.7242 - classification_loss: 0.0852 477/500 [===========================>..] - ETA: 7s - loss: 0.8101 - regression_loss: 0.7248 - classification_loss: 0.0853 478/500 [===========================>..] - ETA: 7s - loss: 0.8095 - regression_loss: 0.7243 - classification_loss: 0.0853 479/500 [===========================>..] - ETA: 7s - loss: 0.8087 - regression_loss: 0.7236 - classification_loss: 0.0851 480/500 [===========================>..] - ETA: 6s - loss: 0.8071 - regression_loss: 0.7221 - classification_loss: 0.0850 481/500 [===========================>..] - ETA: 6s - loss: 0.8076 - regression_loss: 0.7227 - classification_loss: 0.0849 482/500 [===========================>..] - ETA: 6s - loss: 0.8079 - regression_loss: 0.7230 - classification_loss: 0.0849 483/500 [===========================>..] - ETA: 5s - loss: 0.8069 - regression_loss: 0.7221 - classification_loss: 0.0848 484/500 [============================>.] - ETA: 5s - loss: 0.8068 - regression_loss: 0.7221 - classification_loss: 0.0847 485/500 [============================>.] - ETA: 5s - loss: 0.8079 - regression_loss: 0.7231 - classification_loss: 0.0848 486/500 [============================>.] - ETA: 4s - loss: 0.8091 - regression_loss: 0.7239 - classification_loss: 0.0852 487/500 [============================>.] - ETA: 4s - loss: 0.8098 - regression_loss: 0.7244 - classification_loss: 0.0854 488/500 [============================>.] - ETA: 4s - loss: 0.8104 - regression_loss: 0.7250 - classification_loss: 0.0855 489/500 [============================>.] 
- ETA: 3s - loss: 0.8117 - regression_loss: 0.7257 - classification_loss: 0.0860 490/500 [============================>.] - ETA: 3s - loss: 0.8119 - regression_loss: 0.7259 - classification_loss: 0.0860 491/500 [============================>.] - ETA: 3s - loss: 0.8117 - regression_loss: 0.7258 - classification_loss: 0.0859 492/500 [============================>.] - ETA: 2s - loss: 0.8116 - regression_loss: 0.7258 - classification_loss: 0.0858 493/500 [============================>.] - ETA: 2s - loss: 0.8107 - regression_loss: 0.7250 - classification_loss: 0.0857 494/500 [============================>.] - ETA: 2s - loss: 0.8116 - regression_loss: 0.7260 - classification_loss: 0.0856 495/500 [============================>.] - ETA: 1s - loss: 0.8114 - regression_loss: 0.7259 - classification_loss: 0.0855 496/500 [============================>.] - ETA: 1s - loss: 0.8133 - regression_loss: 0.7275 - classification_loss: 0.0858 497/500 [============================>.] - ETA: 1s - loss: 0.8130 - regression_loss: 0.7272 - classification_loss: 0.0858 498/500 [============================>.] - ETA: 0s - loss: 0.8137 - regression_loss: 0.7277 - classification_loss: 0.0860 499/500 [============================>.] - ETA: 0s - loss: 0.8151 - regression_loss: 0.7287 - classification_loss: 0.0864 500/500 [==============================] - 169s 339ms/step - loss: 0.8147 - regression_loss: 0.7284 - classification_loss: 0.0863 326 instances of class plum with average precision: 0.8157 mAP: 0.8157 Epoch 00039: saving model to ./training/snapshots/resnet101_pascal_39.h5 Epoch 40/150 1/500 [..............................] - ETA: 2:35 - loss: 0.6119 - regression_loss: 0.5738 - classification_loss: 0.0381 2/500 [..............................] - ETA: 2:45 - loss: 0.7187 - regression_loss: 0.6542 - classification_loss: 0.0645 3/500 [..............................] - ETA: 2:45 - loss: 0.6401 - regression_loss: 0.5854 - classification_loss: 0.0547 4/500 [..............................] 
- ETA: 1:05 - loss: 0.7763 - regression_loss: 0.7040 - classification_loss: 0.0724 309/500 [=================>............] - ETA: 1:05 - loss: 0.7751 - regression_loss: 0.7028 - classification_loss: 0.0723 310/500 [=================>............] - ETA: 1:04 - loss: 0.7746 - regression_loss: 0.7023 - classification_loss: 0.0723 311/500 [=================>............] - ETA: 1:04 - loss: 0.7730 - regression_loss: 0.7009 - classification_loss: 0.0721 312/500 [=================>............] - ETA: 1:04 - loss: 0.7722 - regression_loss: 0.7003 - classification_loss: 0.0719 313/500 [=================>............] - ETA: 1:03 - loss: 0.7742 - regression_loss: 0.7023 - classification_loss: 0.0719 314/500 [=================>............] - ETA: 1:03 - loss: 0.7741 - regression_loss: 0.7022 - classification_loss: 0.0719 315/500 [=================>............] - ETA: 1:03 - loss: 0.7741 - regression_loss: 0.7023 - classification_loss: 0.0718 316/500 [=================>............] - ETA: 1:02 - loss: 0.7735 - regression_loss: 0.7018 - classification_loss: 0.0717 317/500 [==================>...........] - ETA: 1:02 - loss: 0.7728 - regression_loss: 0.7012 - classification_loss: 0.0716 318/500 [==================>...........] - ETA: 1:02 - loss: 0.7712 - regression_loss: 0.6998 - classification_loss: 0.0714 319/500 [==================>...........] - ETA: 1:01 - loss: 0.7723 - regression_loss: 0.7010 - classification_loss: 0.0713 320/500 [==================>...........] - ETA: 1:01 - loss: 0.7716 - regression_loss: 0.7004 - classification_loss: 0.0712 321/500 [==================>...........] - ETA: 1:01 - loss: 0.7716 - regression_loss: 0.7004 - classification_loss: 0.0712 322/500 [==================>...........] - ETA: 1:00 - loss: 0.7702 - regression_loss: 0.6992 - classification_loss: 0.0710 323/500 [==================>...........] - ETA: 1:00 - loss: 0.7699 - regression_loss: 0.6990 - classification_loss: 0.0709 324/500 [==================>...........] 
- ETA: 59s - loss: 0.7684 - regression_loss: 0.6977 - classification_loss: 0.0708  325/500 [==================>...........] - ETA: 59s - loss: 0.7670 - regression_loss: 0.6964 - classification_loss: 0.0706 326/500 [==================>...........] - ETA: 59s - loss: 0.7671 - regression_loss: 0.6965 - classification_loss: 0.0706 327/500 [==================>...........] - ETA: 58s - loss: 0.7666 - regression_loss: 0.6961 - classification_loss: 0.0705 328/500 [==================>...........] - ETA: 58s - loss: 0.7662 - regression_loss: 0.6957 - classification_loss: 0.0705 329/500 [==================>...........] - ETA: 58s - loss: 0.7666 - regression_loss: 0.6961 - classification_loss: 0.0705 330/500 [==================>...........] - ETA: 57s - loss: 0.7656 - regression_loss: 0.6953 - classification_loss: 0.0703 331/500 [==================>...........] - ETA: 57s - loss: 0.7657 - regression_loss: 0.6953 - classification_loss: 0.0704 332/500 [==================>...........] - ETA: 57s - loss: 0.7669 - regression_loss: 0.6964 - classification_loss: 0.0704 333/500 [==================>...........] - ETA: 56s - loss: 0.7665 - regression_loss: 0.6962 - classification_loss: 0.0703 334/500 [===================>..........] - ETA: 56s - loss: 0.7669 - regression_loss: 0.6966 - classification_loss: 0.0703 335/500 [===================>..........] - ETA: 56s - loss: 0.7675 - regression_loss: 0.6971 - classification_loss: 0.0704 336/500 [===================>..........] - ETA: 55s - loss: 0.7678 - regression_loss: 0.6975 - classification_loss: 0.0703 337/500 [===================>..........] - ETA: 55s - loss: 0.7715 - regression_loss: 0.7006 - classification_loss: 0.0709 338/500 [===================>..........] - ETA: 55s - loss: 0.7704 - regression_loss: 0.6996 - classification_loss: 0.0707 339/500 [===================>..........] - ETA: 54s - loss: 0.7707 - regression_loss: 0.7001 - classification_loss: 0.0707 340/500 [===================>..........] 
- ETA: 54s - loss: 0.7697 - regression_loss: 0.6992 - classification_loss: 0.0705 341/500 [===================>..........] - ETA: 54s - loss: 0.7696 - regression_loss: 0.6991 - classification_loss: 0.0705 342/500 [===================>..........] - ETA: 53s - loss: 0.7700 - regression_loss: 0.6995 - classification_loss: 0.0705 343/500 [===================>..........] - ETA: 53s - loss: 0.7698 - regression_loss: 0.6992 - classification_loss: 0.0706 344/500 [===================>..........] - ETA: 53s - loss: 0.7680 - regression_loss: 0.6976 - classification_loss: 0.0704 345/500 [===================>..........] - ETA: 52s - loss: 0.7677 - regression_loss: 0.6973 - classification_loss: 0.0703 346/500 [===================>..........] - ETA: 52s - loss: 0.7673 - regression_loss: 0.6970 - classification_loss: 0.0703 347/500 [===================>..........] - ETA: 52s - loss: 0.7680 - regression_loss: 0.6976 - classification_loss: 0.0703 348/500 [===================>..........] - ETA: 51s - loss: 0.7674 - regression_loss: 0.6971 - classification_loss: 0.0703 349/500 [===================>..........] - ETA: 51s - loss: 0.7663 - regression_loss: 0.6961 - classification_loss: 0.0703 350/500 [====================>.........] - ETA: 51s - loss: 0.7650 - regression_loss: 0.6949 - classification_loss: 0.0701 351/500 [====================>.........] - ETA: 50s - loss: 0.7655 - regression_loss: 0.6953 - classification_loss: 0.0701 352/500 [====================>.........] - ETA: 50s - loss: 0.7664 - regression_loss: 0.6961 - classification_loss: 0.0704 353/500 [====================>.........] - ETA: 50s - loss: 0.7659 - regression_loss: 0.6956 - classification_loss: 0.0703 354/500 [====================>.........] - ETA: 49s - loss: 0.7661 - regression_loss: 0.6959 - classification_loss: 0.0702 355/500 [====================>.........] - ETA: 49s - loss: 0.7667 - regression_loss: 0.6963 - classification_loss: 0.0704 356/500 [====================>.........] 
- ETA: 49s - loss: 0.7665 - regression_loss: 0.6963 - classification_loss: 0.0702 357/500 [====================>.........] - ETA: 48s - loss: 0.7656 - regression_loss: 0.6954 - classification_loss: 0.0702 358/500 [====================>.........] - ETA: 48s - loss: 0.7639 - regression_loss: 0.6939 - classification_loss: 0.0700 359/500 [====================>.........] - ETA: 48s - loss: 0.7651 - regression_loss: 0.6949 - classification_loss: 0.0701 360/500 [====================>.........] - ETA: 47s - loss: 0.7644 - regression_loss: 0.6943 - classification_loss: 0.0701 361/500 [====================>.........] - ETA: 47s - loss: 0.7642 - regression_loss: 0.6941 - classification_loss: 0.0701 362/500 [====================>.........] - ETA: 46s - loss: 0.7630 - regression_loss: 0.6931 - classification_loss: 0.0699 363/500 [====================>.........] - ETA: 46s - loss: 0.7629 - regression_loss: 0.6930 - classification_loss: 0.0700 364/500 [====================>.........] - ETA: 46s - loss: 0.7625 - regression_loss: 0.6927 - classification_loss: 0.0698 365/500 [====================>.........] - ETA: 45s - loss: 0.7633 - regression_loss: 0.6934 - classification_loss: 0.0700 366/500 [====================>.........] - ETA: 45s - loss: 0.7628 - regression_loss: 0.6930 - classification_loss: 0.0698 367/500 [=====================>........] - ETA: 45s - loss: 0.7616 - regression_loss: 0.6918 - classification_loss: 0.0698 368/500 [=====================>........] - ETA: 44s - loss: 0.7640 - regression_loss: 0.6935 - classification_loss: 0.0705 369/500 [=====================>........] - ETA: 44s - loss: 0.7644 - regression_loss: 0.6940 - classification_loss: 0.0704 370/500 [=====================>........] - ETA: 44s - loss: 0.7656 - regression_loss: 0.6951 - classification_loss: 0.0705 371/500 [=====================>........] - ETA: 43s - loss: 0.7656 - regression_loss: 0.6950 - classification_loss: 0.0706 372/500 [=====================>........] 
- ETA: 43s - loss: 0.7673 - regression_loss: 0.6963 - classification_loss: 0.0710 373/500 [=====================>........] - ETA: 43s - loss: 0.7666 - regression_loss: 0.6957 - classification_loss: 0.0709 374/500 [=====================>........] - ETA: 42s - loss: 0.7675 - regression_loss: 0.6966 - classification_loss: 0.0709 375/500 [=====================>........] - ETA: 42s - loss: 0.7675 - regression_loss: 0.6965 - classification_loss: 0.0710 376/500 [=====================>........] - ETA: 42s - loss: 0.7691 - regression_loss: 0.6976 - classification_loss: 0.0715 377/500 [=====================>........] - ETA: 41s - loss: 0.7688 - regression_loss: 0.6972 - classification_loss: 0.0715 378/500 [=====================>........] - ETA: 41s - loss: 0.7692 - regression_loss: 0.6977 - classification_loss: 0.0715 379/500 [=====================>........] - ETA: 41s - loss: 0.7689 - regression_loss: 0.6975 - classification_loss: 0.0714 380/500 [=====================>........] - ETA: 40s - loss: 0.7688 - regression_loss: 0.6974 - classification_loss: 0.0714 381/500 [=====================>........] - ETA: 40s - loss: 0.7681 - regression_loss: 0.6968 - classification_loss: 0.0713 382/500 [=====================>........] - ETA: 40s - loss: 0.7700 - regression_loss: 0.6981 - classification_loss: 0.0719 383/500 [=====================>........] - ETA: 39s - loss: 0.7716 - regression_loss: 0.6995 - classification_loss: 0.0721 384/500 [======================>.......] - ETA: 39s - loss: 0.7714 - regression_loss: 0.6994 - classification_loss: 0.0720 385/500 [======================>.......] - ETA: 39s - loss: 0.7700 - regression_loss: 0.6982 - classification_loss: 0.0718 386/500 [======================>.......] - ETA: 38s - loss: 0.7696 - regression_loss: 0.6978 - classification_loss: 0.0718 387/500 [======================>.......] - ETA: 38s - loss: 0.7700 - regression_loss: 0.6980 - classification_loss: 0.0720 388/500 [======================>.......] 
- ETA: 38s - loss: 0.7713 - regression_loss: 0.6991 - classification_loss: 0.0722 389/500 [======================>.......] - ETA: 37s - loss: 0.7705 - regression_loss: 0.6984 - classification_loss: 0.0722 390/500 [======================>.......] - ETA: 37s - loss: 0.7715 - regression_loss: 0.6993 - classification_loss: 0.0722 391/500 [======================>.......] - ETA: 37s - loss: 0.7717 - regression_loss: 0.6995 - classification_loss: 0.0722 392/500 [======================>.......] - ETA: 36s - loss: 0.7732 - regression_loss: 0.7009 - classification_loss: 0.0723 393/500 [======================>.......] - ETA: 36s - loss: 0.7740 - regression_loss: 0.7016 - classification_loss: 0.0724 394/500 [======================>.......] - ETA: 36s - loss: 0.7739 - regression_loss: 0.7016 - classification_loss: 0.0723 395/500 [======================>.......] - ETA: 35s - loss: 0.7754 - regression_loss: 0.7027 - classification_loss: 0.0727 396/500 [======================>.......] - ETA: 35s - loss: 0.7753 - regression_loss: 0.7026 - classification_loss: 0.0726 397/500 [======================>.......] - ETA: 35s - loss: 0.7757 - regression_loss: 0.7030 - classification_loss: 0.0727 398/500 [======================>.......] - ETA: 34s - loss: 0.7751 - regression_loss: 0.7024 - classification_loss: 0.0727 399/500 [======================>.......] - ETA: 34s - loss: 0.7745 - regression_loss: 0.7019 - classification_loss: 0.0726 400/500 [=======================>......] - ETA: 34s - loss: 0.7750 - regression_loss: 0.7023 - classification_loss: 0.0727 401/500 [=======================>......] - ETA: 33s - loss: 0.7749 - regression_loss: 0.7023 - classification_loss: 0.0727 402/500 [=======================>......] - ETA: 33s - loss: 0.7762 - regression_loss: 0.7032 - classification_loss: 0.0730 403/500 [=======================>......] - ETA: 33s - loss: 0.7761 - regression_loss: 0.7032 - classification_loss: 0.0729 404/500 [=======================>......] 
- ETA: 32s - loss: 0.7762 - regression_loss: 0.7033 - classification_loss: 0.0729 405/500 [=======================>......] - ETA: 32s - loss: 0.7760 - regression_loss: 0.7032 - classification_loss: 0.0728 406/500 [=======================>......] - ETA: 31s - loss: 0.7751 - regression_loss: 0.7023 - classification_loss: 0.0728 407/500 [=======================>......] - ETA: 31s - loss: 0.7757 - regression_loss: 0.7030 - classification_loss: 0.0728 408/500 [=======================>......] - ETA: 31s - loss: 0.7753 - regression_loss: 0.7026 - classification_loss: 0.0727 409/500 [=======================>......] - ETA: 30s - loss: 0.7757 - regression_loss: 0.7030 - classification_loss: 0.0728 410/500 [=======================>......] - ETA: 30s - loss: 0.7760 - regression_loss: 0.7033 - classification_loss: 0.0727 411/500 [=======================>......] - ETA: 30s - loss: 0.7757 - regression_loss: 0.7031 - classification_loss: 0.0726 412/500 [=======================>......] - ETA: 29s - loss: 0.7755 - regression_loss: 0.7029 - classification_loss: 0.0726 413/500 [=======================>......] - ETA: 29s - loss: 0.7757 - regression_loss: 0.7031 - classification_loss: 0.0726 414/500 [=======================>......] - ETA: 29s - loss: 0.7760 - regression_loss: 0.7033 - classification_loss: 0.0726 415/500 [=======================>......] - ETA: 28s - loss: 0.7753 - regression_loss: 0.7028 - classification_loss: 0.0726 416/500 [=======================>......] - ETA: 28s - loss: 0.7746 - regression_loss: 0.7021 - classification_loss: 0.0725 417/500 [========================>.....] - ETA: 28s - loss: 0.7739 - regression_loss: 0.7015 - classification_loss: 0.0724 418/500 [========================>.....] - ETA: 27s - loss: 0.7748 - regression_loss: 0.7024 - classification_loss: 0.0724 419/500 [========================>.....] - ETA: 27s - loss: 0.7735 - regression_loss: 0.7012 - classification_loss: 0.0723 420/500 [========================>.....] 
- ETA: 27s - loss: 0.7722 - regression_loss: 0.7000 - classification_loss: 0.0722 421/500 [========================>.....] - ETA: 26s - loss: 0.7727 - regression_loss: 0.7003 - classification_loss: 0.0723 422/500 [========================>.....] - ETA: 26s - loss: 0.7728 - regression_loss: 0.7005 - classification_loss: 0.0723 423/500 [========================>.....] - ETA: 26s - loss: 0.7724 - regression_loss: 0.7001 - classification_loss: 0.0722 424/500 [========================>.....] - ETA: 25s - loss: 0.7724 - regression_loss: 0.7002 - classification_loss: 0.0722 425/500 [========================>.....] - ETA: 25s - loss: 0.7728 - regression_loss: 0.7007 - classification_loss: 0.0721 426/500 [========================>.....] - ETA: 25s - loss: 0.7726 - regression_loss: 0.7006 - classification_loss: 0.0720 427/500 [========================>.....] - ETA: 24s - loss: 0.7734 - regression_loss: 0.7014 - classification_loss: 0.0720 428/500 [========================>.....] - ETA: 24s - loss: 0.7739 - regression_loss: 0.7017 - classification_loss: 0.0722 429/500 [========================>.....] - ETA: 24s - loss: 0.7736 - regression_loss: 0.7014 - classification_loss: 0.0722 430/500 [========================>.....] - ETA: 23s - loss: 0.7726 - regression_loss: 0.7006 - classification_loss: 0.0721 431/500 [========================>.....] - ETA: 23s - loss: 0.7735 - regression_loss: 0.7015 - classification_loss: 0.0720 432/500 [========================>.....] - ETA: 23s - loss: 0.7728 - regression_loss: 0.7009 - classification_loss: 0.0719 433/500 [========================>.....] - ETA: 22s - loss: 0.7721 - regression_loss: 0.7004 - classification_loss: 0.0718 434/500 [=========================>....] - ETA: 22s - loss: 0.7719 - regression_loss: 0.7001 - classification_loss: 0.0718 435/500 [=========================>....] - ETA: 22s - loss: 0.7710 - regression_loss: 0.6994 - classification_loss: 0.0716 436/500 [=========================>....] 
- ETA: 21s - loss: 0.7715 - regression_loss: 0.6998 - classification_loss: 0.0717 437/500 [=========================>....] - ETA: 21s - loss: 0.7710 - regression_loss: 0.6993 - classification_loss: 0.0718 438/500 [=========================>....] - ETA: 21s - loss: 0.7715 - regression_loss: 0.6996 - classification_loss: 0.0718 439/500 [=========================>....] - ETA: 20s - loss: 0.7721 - regression_loss: 0.7002 - classification_loss: 0.0719 440/500 [=========================>....] - ETA: 20s - loss: 0.7725 - regression_loss: 0.7006 - classification_loss: 0.0719 441/500 [=========================>....] - ETA: 20s - loss: 0.7719 - regression_loss: 0.7001 - classification_loss: 0.0718 442/500 [=========================>....] - ETA: 19s - loss: 0.7723 - regression_loss: 0.7005 - classification_loss: 0.0718 443/500 [=========================>....] - ETA: 19s - loss: 0.7714 - regression_loss: 0.6997 - classification_loss: 0.0717 444/500 [=========================>....] - ETA: 19s - loss: 0.7713 - regression_loss: 0.6996 - classification_loss: 0.0717 445/500 [=========================>....] - ETA: 18s - loss: 0.7732 - regression_loss: 0.7011 - classification_loss: 0.0721 446/500 [=========================>....] - ETA: 18s - loss: 0.7741 - regression_loss: 0.7019 - classification_loss: 0.0722 447/500 [=========================>....] - ETA: 18s - loss: 0.7739 - regression_loss: 0.7018 - classification_loss: 0.0721 448/500 [=========================>....] - ETA: 17s - loss: 0.7736 - regression_loss: 0.7016 - classification_loss: 0.0720 449/500 [=========================>....] - ETA: 17s - loss: 0.7734 - regression_loss: 0.7015 - classification_loss: 0.0720 450/500 [==========================>...] - ETA: 17s - loss: 0.7727 - regression_loss: 0.7008 - classification_loss: 0.0719 451/500 [==========================>...] - ETA: 16s - loss: 0.7737 - regression_loss: 0.7015 - classification_loss: 0.0721 452/500 [==========================>...] 
- ETA: 16s - loss: 0.7749 - regression_loss: 0.7026 - classification_loss: 0.0723 453/500 [==========================>...] - ETA: 15s - loss: 0.7754 - regression_loss: 0.7032 - classification_loss: 0.0722 454/500 [==========================>...] - ETA: 15s - loss: 0.7797 - regression_loss: 0.7069 - classification_loss: 0.0728 455/500 [==========================>...] - ETA: 15s - loss: 0.7806 - regression_loss: 0.7071 - classification_loss: 0.0735 456/500 [==========================>...] - ETA: 14s - loss: 0.7801 - regression_loss: 0.7067 - classification_loss: 0.0735 457/500 [==========================>...] - ETA: 14s - loss: 0.7814 - regression_loss: 0.7078 - classification_loss: 0.0736 458/500 [==========================>...] - ETA: 14s - loss: 0.7814 - regression_loss: 0.7079 - classification_loss: 0.0735 459/500 [==========================>...] - ETA: 13s - loss: 0.7812 - regression_loss: 0.7077 - classification_loss: 0.0735 460/500 [==========================>...] - ETA: 13s - loss: 0.7821 - regression_loss: 0.7085 - classification_loss: 0.0736 461/500 [==========================>...] - ETA: 13s - loss: 0.7825 - regression_loss: 0.7089 - classification_loss: 0.0737 462/500 [==========================>...] - ETA: 12s - loss: 0.7826 - regression_loss: 0.7088 - classification_loss: 0.0738 463/500 [==========================>...] - ETA: 12s - loss: 0.7816 - regression_loss: 0.7079 - classification_loss: 0.0737 464/500 [==========================>...] - ETA: 12s - loss: 0.7812 - regression_loss: 0.7076 - classification_loss: 0.0736 465/500 [==========================>...] - ETA: 11s - loss: 0.7806 - regression_loss: 0.7071 - classification_loss: 0.0735 466/500 [==========================>...] - ETA: 11s - loss: 0.7809 - regression_loss: 0.7074 - classification_loss: 0.0735 467/500 [===========================>..] - ETA: 11s - loss: 0.7803 - regression_loss: 0.7069 - classification_loss: 0.0734 468/500 [===========================>..] 
- ETA: 10s - loss: 0.7807 - regression_loss: 0.7073 - classification_loss: 0.0734 469/500 [===========================>..] - ETA: 10s - loss: 0.7797 - regression_loss: 0.7064 - classification_loss: 0.0733 470/500 [===========================>..] - ETA: 10s - loss: 0.7807 - regression_loss: 0.7074 - classification_loss: 0.0733 471/500 [===========================>..] - ETA: 9s - loss: 0.7811 - regression_loss: 0.7078 - classification_loss: 0.0733  472/500 [===========================>..] - ETA: 9s - loss: 0.7810 - regression_loss: 0.7077 - classification_loss: 0.0733 473/500 [===========================>..] - ETA: 9s - loss: 0.7805 - regression_loss: 0.7072 - classification_loss: 0.0733 474/500 [===========================>..] - ETA: 8s - loss: 0.7797 - regression_loss: 0.7065 - classification_loss: 0.0732 475/500 [===========================>..] - ETA: 8s - loss: 0.7807 - regression_loss: 0.7075 - classification_loss: 0.0732 476/500 [===========================>..] - ETA: 8s - loss: 0.7804 - regression_loss: 0.7072 - classification_loss: 0.0731 477/500 [===========================>..] - ETA: 7s - loss: 0.7794 - regression_loss: 0.7064 - classification_loss: 0.0730 478/500 [===========================>..] - ETA: 7s - loss: 0.7791 - regression_loss: 0.7062 - classification_loss: 0.0729 479/500 [===========================>..] - ETA: 7s - loss: 0.7782 - regression_loss: 0.7054 - classification_loss: 0.0728 480/500 [===========================>..] - ETA: 6s - loss: 0.7797 - regression_loss: 0.7066 - classification_loss: 0.0731 481/500 [===========================>..] - ETA: 6s - loss: 0.7798 - regression_loss: 0.7067 - classification_loss: 0.0731 482/500 [===========================>..] - ETA: 6s - loss: 0.7802 - regression_loss: 0.7071 - classification_loss: 0.0732 483/500 [===========================>..] - ETA: 5s - loss: 0.7813 - regression_loss: 0.7079 - classification_loss: 0.0733 484/500 [============================>.] 
- ETA: 5s - loss: 0.7808 - regression_loss: 0.7075 - classification_loss: 0.0732 485/500 [============================>.] - ETA: 5s - loss: 0.7807 - regression_loss: 0.7075 - classification_loss: 0.0732 486/500 [============================>.] - ETA: 4s - loss: 0.7809 - regression_loss: 0.7077 - classification_loss: 0.0732 487/500 [============================>.] - ETA: 4s - loss: 0.7819 - regression_loss: 0.7087 - classification_loss: 0.0732 488/500 [============================>.] - ETA: 4s - loss: 0.7816 - regression_loss: 0.7084 - classification_loss: 0.0731 489/500 [============================>.] - ETA: 3s - loss: 0.7808 - regression_loss: 0.7077 - classification_loss: 0.0731 490/500 [============================>.] - ETA: 3s - loss: 0.7805 - regression_loss: 0.7075 - classification_loss: 0.0731 491/500 [============================>.] - ETA: 3s - loss: 0.7805 - regression_loss: 0.7075 - classification_loss: 0.0730 492/500 [============================>.] - ETA: 2s - loss: 0.7811 - regression_loss: 0.7081 - classification_loss: 0.0730 493/500 [============================>.] - ETA: 2s - loss: 0.7807 - regression_loss: 0.7078 - classification_loss: 0.0729 494/500 [============================>.] - ETA: 2s - loss: 0.7804 - regression_loss: 0.7075 - classification_loss: 0.0729 495/500 [============================>.] - ETA: 1s - loss: 0.7802 - regression_loss: 0.7074 - classification_loss: 0.0729 496/500 [============================>.] - ETA: 1s - loss: 0.7803 - regression_loss: 0.7075 - classification_loss: 0.0728 497/500 [============================>.] - ETA: 1s - loss: 0.7804 - regression_loss: 0.7076 - classification_loss: 0.0728 498/500 [============================>.] - ETA: 0s - loss: 0.7804 - regression_loss: 0.7074 - classification_loss: 0.0730 499/500 [============================>.] 
500/500 [==============================] - 170s 340ms/step - loss: 0.7805 - regression_loss: 0.7075 - classification_loss: 0.0730
326 instances of class plum with average precision: 0.8549
mAP: 0.8549
Epoch 00040: saving model to ./training/snapshots/resnet101_pascal_40.h5
Epoch 41/150
- ETA: 2:23 - loss: 0.7417 - regression_loss: 0.6763 - classification_loss: 0.0655 79/500 [===>..........................] - ETA: 2:23 - loss: 0.7419 - regression_loss: 0.6765 - classification_loss: 0.0653 80/500 [===>..........................] - ETA: 2:22 - loss: 0.7450 - regression_loss: 0.6791 - classification_loss: 0.0659 81/500 [===>..........................] - ETA: 2:22 - loss: 0.7440 - regression_loss: 0.6786 - classification_loss: 0.0655 82/500 [===>..........................] - ETA: 2:22 - loss: 0.7461 - regression_loss: 0.6806 - classification_loss: 0.0655 83/500 [===>..........................] - ETA: 2:21 - loss: 0.7445 - regression_loss: 0.6793 - classification_loss: 0.0652 84/500 [====>.........................] - ETA: 2:21 - loss: 0.7460 - regression_loss: 0.6808 - classification_loss: 0.0652 85/500 [====>.........................] - ETA: 2:21 - loss: 0.7415 - regression_loss: 0.6769 - classification_loss: 0.0645 86/500 [====>.........................] - ETA: 2:20 - loss: 0.7456 - regression_loss: 0.6805 - classification_loss: 0.0651 87/500 [====>.........................] - ETA: 2:20 - loss: 0.7436 - regression_loss: 0.6787 - classification_loss: 0.0650 88/500 [====>.........................] - ETA: 2:20 - loss: 0.7462 - regression_loss: 0.6812 - classification_loss: 0.0650 89/500 [====>.........................] - ETA: 2:19 - loss: 0.7448 - regression_loss: 0.6802 - classification_loss: 0.0645 90/500 [====>.........................] - ETA: 2:19 - loss: 0.7460 - regression_loss: 0.6809 - classification_loss: 0.0650 91/500 [====>.........................] - ETA: 2:19 - loss: 0.7497 - regression_loss: 0.6831 - classification_loss: 0.0666 92/500 [====>.........................] - ETA: 2:18 - loss: 0.7506 - regression_loss: 0.6838 - classification_loss: 0.0669 93/500 [====>.........................] - ETA: 2:18 - loss: 0.7469 - regression_loss: 0.6806 - classification_loss: 0.0664 94/500 [====>.........................] 
- ETA: 2:18 - loss: 0.7449 - regression_loss: 0.6788 - classification_loss: 0.0660 95/500 [====>.........................] - ETA: 2:17 - loss: 0.7432 - regression_loss: 0.6775 - classification_loss: 0.0658 96/500 [====>.........................] - ETA: 2:17 - loss: 0.7402 - regression_loss: 0.6749 - classification_loss: 0.0653 97/500 [====>.........................] - ETA: 2:17 - loss: 0.7478 - regression_loss: 0.6807 - classification_loss: 0.0671 98/500 [====>.........................] - ETA: 2:16 - loss: 0.7506 - regression_loss: 0.6834 - classification_loss: 0.0672 99/500 [====>.........................] - ETA: 2:16 - loss: 0.7595 - regression_loss: 0.6901 - classification_loss: 0.0694 100/500 [=====>........................] - ETA: 2:16 - loss: 0.7621 - regression_loss: 0.6920 - classification_loss: 0.0701 101/500 [=====>........................] - ETA: 2:15 - loss: 0.7573 - regression_loss: 0.6878 - classification_loss: 0.0696 102/500 [=====>........................] - ETA: 2:15 - loss: 0.7583 - regression_loss: 0.6890 - classification_loss: 0.0694 103/500 [=====>........................] - ETA: 2:15 - loss: 0.7553 - regression_loss: 0.6864 - classification_loss: 0.0689 104/500 [=====>........................] - ETA: 2:14 - loss: 0.7551 - regression_loss: 0.6863 - classification_loss: 0.0688 105/500 [=====>........................] - ETA: 2:14 - loss: 0.7529 - regression_loss: 0.6844 - classification_loss: 0.0685 106/500 [=====>........................] - ETA: 2:13 - loss: 0.7539 - regression_loss: 0.6855 - classification_loss: 0.0684 107/500 [=====>........................] - ETA: 2:13 - loss: 0.7536 - regression_loss: 0.6853 - classification_loss: 0.0684 108/500 [=====>........................] - ETA: 2:13 - loss: 0.7555 - regression_loss: 0.6869 - classification_loss: 0.0686 109/500 [=====>........................] - ETA: 2:12 - loss: 0.7552 - regression_loss: 0.6863 - classification_loss: 0.0688 110/500 [=====>........................] 
- ETA: 2:12 - loss: 0.7524 - regression_loss: 0.6839 - classification_loss: 0.0685 111/500 [=====>........................] - ETA: 2:12 - loss: 0.7533 - regression_loss: 0.6851 - classification_loss: 0.0682 112/500 [=====>........................] - ETA: 2:12 - loss: 0.7514 - regression_loss: 0.6835 - classification_loss: 0.0679 113/500 [=====>........................] - ETA: 2:11 - loss: 0.7526 - regression_loss: 0.6847 - classification_loss: 0.0679 114/500 [=====>........................] - ETA: 2:11 - loss: 0.7484 - regression_loss: 0.6810 - classification_loss: 0.0674 115/500 [=====>........................] - ETA: 2:11 - loss: 0.7581 - regression_loss: 0.6888 - classification_loss: 0.0693 116/500 [=====>........................] - ETA: 2:10 - loss: 0.7576 - regression_loss: 0.6886 - classification_loss: 0.0690 117/500 [======>.......................] - ETA: 2:10 - loss: 0.7647 - regression_loss: 0.6948 - classification_loss: 0.0698 118/500 [======>.......................] - ETA: 2:10 - loss: 0.7644 - regression_loss: 0.6948 - classification_loss: 0.0696 119/500 [======>.......................] - ETA: 2:09 - loss: 0.7629 - regression_loss: 0.6934 - classification_loss: 0.0694 120/500 [======>.......................] - ETA: 2:09 - loss: 0.7637 - regression_loss: 0.6944 - classification_loss: 0.0693 121/500 [======>.......................] - ETA: 2:08 - loss: 0.7637 - regression_loss: 0.6944 - classification_loss: 0.0693 122/500 [======>.......................] - ETA: 2:08 - loss: 0.7604 - regression_loss: 0.6914 - classification_loss: 0.0689 123/500 [======>.......................] - ETA: 2:08 - loss: 0.7617 - regression_loss: 0.6928 - classification_loss: 0.0689 124/500 [======>.......................] - ETA: 2:07 - loss: 0.7665 - regression_loss: 0.6972 - classification_loss: 0.0693 125/500 [======>.......................] - ETA: 2:07 - loss: 0.7641 - regression_loss: 0.6947 - classification_loss: 0.0693 126/500 [======>.......................] 
- ETA: 2:07 - loss: 0.7630 - regression_loss: 0.6938 - classification_loss: 0.0692 127/500 [======>.......................] - ETA: 2:06 - loss: 0.7629 - regression_loss: 0.6937 - classification_loss: 0.0691 128/500 [======>.......................] - ETA: 2:06 - loss: 0.7594 - regression_loss: 0.6907 - classification_loss: 0.0687 129/500 [======>.......................] - ETA: 2:06 - loss: 0.7563 - regression_loss: 0.6879 - classification_loss: 0.0684 130/500 [======>.......................] - ETA: 2:05 - loss: 0.7583 - regression_loss: 0.6894 - classification_loss: 0.0689 131/500 [======>.......................] - ETA: 2:05 - loss: 0.7560 - regression_loss: 0.6873 - classification_loss: 0.0687 132/500 [======>.......................] - ETA: 2:05 - loss: 0.7560 - regression_loss: 0.6873 - classification_loss: 0.0687 133/500 [======>.......................] - ETA: 2:04 - loss: 0.7570 - regression_loss: 0.6884 - classification_loss: 0.0686 134/500 [=======>......................] - ETA: 2:04 - loss: 0.7559 - regression_loss: 0.6876 - classification_loss: 0.0683 135/500 [=======>......................] - ETA: 2:04 - loss: 0.7559 - regression_loss: 0.6878 - classification_loss: 0.0681 136/500 [=======>......................] - ETA: 2:03 - loss: 0.7569 - regression_loss: 0.6890 - classification_loss: 0.0679 137/500 [=======>......................] - ETA: 2:03 - loss: 0.7554 - regression_loss: 0.6877 - classification_loss: 0.0676 138/500 [=======>......................] - ETA: 2:03 - loss: 0.7538 - regression_loss: 0.6862 - classification_loss: 0.0676 139/500 [=======>......................] - ETA: 2:02 - loss: 0.7554 - regression_loss: 0.6878 - classification_loss: 0.0676 140/500 [=======>......................] - ETA: 2:02 - loss: 0.7547 - regression_loss: 0.6872 - classification_loss: 0.0675 141/500 [=======>......................] - ETA: 2:02 - loss: 0.7546 - regression_loss: 0.6869 - classification_loss: 0.0677 142/500 [=======>......................] 
- ETA: 2:01 - loss: 0.7521 - regression_loss: 0.6848 - classification_loss: 0.0673 143/500 [=======>......................] - ETA: 2:01 - loss: 0.7511 - regression_loss: 0.6840 - classification_loss: 0.0671 144/500 [=======>......................] - ETA: 2:01 - loss: 0.7488 - regression_loss: 0.6821 - classification_loss: 0.0667 145/500 [=======>......................] - ETA: 2:00 - loss: 0.7454 - regression_loss: 0.6791 - classification_loss: 0.0663 146/500 [=======>......................] - ETA: 2:00 - loss: 0.7477 - regression_loss: 0.6816 - classification_loss: 0.0661 147/500 [=======>......................] - ETA: 2:00 - loss: 0.7483 - regression_loss: 0.6824 - classification_loss: 0.0659 148/500 [=======>......................] - ETA: 1:59 - loss: 0.7471 - regression_loss: 0.6814 - classification_loss: 0.0657 149/500 [=======>......................] - ETA: 1:59 - loss: 0.7499 - regression_loss: 0.6838 - classification_loss: 0.0660 150/500 [========>.....................] - ETA: 1:59 - loss: 0.7501 - regression_loss: 0.6842 - classification_loss: 0.0659 151/500 [========>.....................] - ETA: 1:58 - loss: 0.7478 - regression_loss: 0.6822 - classification_loss: 0.0656 152/500 [========>.....................] - ETA: 1:58 - loss: 0.7489 - regression_loss: 0.6834 - classification_loss: 0.0655 153/500 [========>.....................] - ETA: 1:58 - loss: 0.7487 - regression_loss: 0.6832 - classification_loss: 0.0655 154/500 [========>.....................] - ETA: 1:57 - loss: 0.7493 - regression_loss: 0.6836 - classification_loss: 0.0657 155/500 [========>.....................] - ETA: 1:57 - loss: 0.7476 - regression_loss: 0.6820 - classification_loss: 0.0656 156/500 [========>.....................] - ETA: 1:57 - loss: 0.7489 - regression_loss: 0.6830 - classification_loss: 0.0658 157/500 [========>.....................] - ETA: 1:56 - loss: 0.7508 - regression_loss: 0.6851 - classification_loss: 0.0657 158/500 [========>.....................] 
- ETA: 1:56 - loss: 0.7517 - regression_loss: 0.6856 - classification_loss: 0.0661 159/500 [========>.....................] - ETA: 1:56 - loss: 0.7505 - regression_loss: 0.6845 - classification_loss: 0.0660 160/500 [========>.....................] - ETA: 1:55 - loss: 0.7503 - regression_loss: 0.6843 - classification_loss: 0.0660 161/500 [========>.....................] - ETA: 1:55 - loss: 0.7485 - regression_loss: 0.6827 - classification_loss: 0.0658 162/500 [========>.....................] - ETA: 1:55 - loss: 0.7472 - regression_loss: 0.6818 - classification_loss: 0.0655 163/500 [========>.....................] - ETA: 1:54 - loss: 0.7507 - regression_loss: 0.6846 - classification_loss: 0.0661 164/500 [========>.....................] - ETA: 1:54 - loss: 0.7484 - regression_loss: 0.6826 - classification_loss: 0.0658 165/500 [========>.....................] - ETA: 1:54 - loss: 0.7480 - regression_loss: 0.6822 - classification_loss: 0.0657 166/500 [========>.....................] - ETA: 1:53 - loss: 0.7477 - regression_loss: 0.6821 - classification_loss: 0.0657 167/500 [=========>....................] - ETA: 1:53 - loss: 0.7477 - regression_loss: 0.6819 - classification_loss: 0.0658 168/500 [=========>....................] - ETA: 1:53 - loss: 0.7469 - regression_loss: 0.6813 - classification_loss: 0.0656 169/500 [=========>....................] - ETA: 1:52 - loss: 0.7482 - regression_loss: 0.6824 - classification_loss: 0.0658 170/500 [=========>....................] - ETA: 1:52 - loss: 0.7484 - regression_loss: 0.6827 - classification_loss: 0.0658 171/500 [=========>....................] - ETA: 1:52 - loss: 0.7470 - regression_loss: 0.6814 - classification_loss: 0.0656 172/500 [=========>....................] - ETA: 1:51 - loss: 0.7450 - regression_loss: 0.6798 - classification_loss: 0.0653 173/500 [=========>....................] - ETA: 1:51 - loss: 0.7440 - regression_loss: 0.6789 - classification_loss: 0.0651 174/500 [=========>....................] 
- ETA: 1:51 - loss: 0.7435 - regression_loss: 0.6785 - classification_loss: 0.0649 175/500 [=========>....................] - ETA: 1:50 - loss: 0.7417 - regression_loss: 0.6770 - classification_loss: 0.0647 176/500 [=========>....................] - ETA: 1:50 - loss: 0.7501 - regression_loss: 0.6831 - classification_loss: 0.0670 177/500 [=========>....................] - ETA: 1:50 - loss: 0.7530 - regression_loss: 0.6852 - classification_loss: 0.0678 178/500 [=========>....................] - ETA: 1:49 - loss: 0.7548 - regression_loss: 0.6867 - classification_loss: 0.0680 179/500 [=========>....................] - ETA: 1:49 - loss: 0.7527 - regression_loss: 0.6848 - classification_loss: 0.0679 180/500 [=========>....................] - ETA: 1:49 - loss: 0.7555 - regression_loss: 0.6875 - classification_loss: 0.0680 181/500 [=========>....................] - ETA: 1:48 - loss: 0.7560 - regression_loss: 0.6880 - classification_loss: 0.0680 182/500 [=========>....................] - ETA: 1:48 - loss: 0.7542 - regression_loss: 0.6865 - classification_loss: 0.0677 183/500 [=========>....................] - ETA: 1:47 - loss: 0.7536 - regression_loss: 0.6860 - classification_loss: 0.0676 184/500 [==========>...................] - ETA: 1:47 - loss: 0.7540 - regression_loss: 0.6863 - classification_loss: 0.0676 185/500 [==========>...................] - ETA: 1:47 - loss: 0.7537 - regression_loss: 0.6863 - classification_loss: 0.0673 186/500 [==========>...................] - ETA: 1:46 - loss: 0.7527 - regression_loss: 0.6856 - classification_loss: 0.0671 187/500 [==========>...................] - ETA: 1:46 - loss: 0.7520 - regression_loss: 0.6850 - classification_loss: 0.0670 188/500 [==========>...................] - ETA: 1:46 - loss: 0.7543 - regression_loss: 0.6870 - classification_loss: 0.0672 189/500 [==========>...................] - ETA: 1:45 - loss: 0.7583 - regression_loss: 0.6903 - classification_loss: 0.0680 190/500 [==========>...................] 
- ETA: 1:45 - loss: 0.7560 - regression_loss: 0.6879 - classification_loss: 0.0681 191/500 [==========>...................] - ETA: 1:45 - loss: 0.7619 - regression_loss: 0.6921 - classification_loss: 0.0699 192/500 [==========>...................] - ETA: 1:44 - loss: 0.7644 - regression_loss: 0.6944 - classification_loss: 0.0700 193/500 [==========>...................] - ETA: 1:44 - loss: 0.7656 - regression_loss: 0.6956 - classification_loss: 0.0701 194/500 [==========>...................] - ETA: 1:44 - loss: 0.7652 - regression_loss: 0.6953 - classification_loss: 0.0698 195/500 [==========>...................] - ETA: 1:43 - loss: 0.7643 - regression_loss: 0.6946 - classification_loss: 0.0697 196/500 [==========>...................] - ETA: 1:43 - loss: 0.7688 - regression_loss: 0.6984 - classification_loss: 0.0704 197/500 [==========>...................] - ETA: 1:43 - loss: 0.7676 - regression_loss: 0.6975 - classification_loss: 0.0701 198/500 [==========>...................] - ETA: 1:42 - loss: 0.7673 - regression_loss: 0.6975 - classification_loss: 0.0698 199/500 [==========>...................] - ETA: 1:42 - loss: 0.7677 - regression_loss: 0.6980 - classification_loss: 0.0697 200/500 [===========>..................] - ETA: 1:42 - loss: 0.7659 - regression_loss: 0.6965 - classification_loss: 0.0694 201/500 [===========>..................] - ETA: 1:41 - loss: 0.7680 - regression_loss: 0.6980 - classification_loss: 0.0700 202/500 [===========>..................] - ETA: 1:41 - loss: 0.7689 - regression_loss: 0.6987 - classification_loss: 0.0701 203/500 [===========>..................] - ETA: 1:41 - loss: 0.7714 - regression_loss: 0.7007 - classification_loss: 0.0707 204/500 [===========>..................] - ETA: 1:40 - loss: 0.7716 - regression_loss: 0.7010 - classification_loss: 0.0705 205/500 [===========>..................] - ETA: 1:40 - loss: 0.7714 - regression_loss: 0.7009 - classification_loss: 0.0705 206/500 [===========>..................] 
- ETA: 1:40 - loss: 0.7750 - regression_loss: 0.7041 - classification_loss: 0.0709 207/500 [===========>..................] - ETA: 1:39 - loss: 0.7751 - regression_loss: 0.7043 - classification_loss: 0.0708 208/500 [===========>..................] - ETA: 1:39 - loss: 0.7771 - regression_loss: 0.7063 - classification_loss: 0.0707 209/500 [===========>..................] - ETA: 1:38 - loss: 0.7766 - regression_loss: 0.7061 - classification_loss: 0.0705 210/500 [===========>..................] - ETA: 1:38 - loss: 0.7769 - regression_loss: 0.7064 - classification_loss: 0.0705 211/500 [===========>..................] - ETA: 1:38 - loss: 0.7759 - regression_loss: 0.7054 - classification_loss: 0.0705 212/500 [===========>..................] - ETA: 1:37 - loss: 0.7764 - regression_loss: 0.7053 - classification_loss: 0.0711 213/500 [===========>..................] - ETA: 1:37 - loss: 0.7763 - regression_loss: 0.7052 - classification_loss: 0.0711 214/500 [===========>..................] - ETA: 1:37 - loss: 0.7755 - regression_loss: 0.7045 - classification_loss: 0.0710 215/500 [===========>..................] - ETA: 1:36 - loss: 0.7738 - regression_loss: 0.7030 - classification_loss: 0.0707 216/500 [===========>..................] - ETA: 1:36 - loss: 0.7753 - regression_loss: 0.7045 - classification_loss: 0.0708 217/500 [============>.................] - ETA: 1:36 - loss: 0.7766 - regression_loss: 0.7059 - classification_loss: 0.0707 218/500 [============>.................] - ETA: 1:35 - loss: 0.7739 - regression_loss: 0.7035 - classification_loss: 0.0704 219/500 [============>.................] - ETA: 1:35 - loss: 0.7753 - regression_loss: 0.7050 - classification_loss: 0.0703 220/500 [============>.................] - ETA: 1:35 - loss: 0.7742 - regression_loss: 0.7040 - classification_loss: 0.0702 221/500 [============>.................] - ETA: 1:34 - loss: 0.7743 - regression_loss: 0.7040 - classification_loss: 0.0703 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.7749 - regression_loss: 0.7045 - classification_loss: 0.0704 223/500 [============>.................] - ETA: 1:34 - loss: 0.7753 - regression_loss: 0.7048 - classification_loss: 0.0705 224/500 [============>.................] - ETA: 1:33 - loss: 0.7854 - regression_loss: 0.7130 - classification_loss: 0.0724 225/500 [============>.................] - ETA: 1:33 - loss: 0.7846 - regression_loss: 0.7123 - classification_loss: 0.0723 226/500 [============>.................] - ETA: 1:33 - loss: 0.7852 - regression_loss: 0.7128 - classification_loss: 0.0724 227/500 [============>.................] - ETA: 1:32 - loss: 0.7836 - regression_loss: 0.7115 - classification_loss: 0.0721 228/500 [============>.................] - ETA: 1:32 - loss: 0.7840 - regression_loss: 0.7119 - classification_loss: 0.0722 229/500 [============>.................] - ETA: 1:32 - loss: 0.7836 - regression_loss: 0.7116 - classification_loss: 0.0720 230/500 [============>.................] - ETA: 1:31 - loss: 0.7823 - regression_loss: 0.7104 - classification_loss: 0.0719 231/500 [============>.................] - ETA: 1:31 - loss: 0.7832 - regression_loss: 0.7112 - classification_loss: 0.0719 232/500 [============>.................] - ETA: 1:31 - loss: 0.7827 - regression_loss: 0.7109 - classification_loss: 0.0718 233/500 [============>.................] - ETA: 1:30 - loss: 0.7822 - regression_loss: 0.7102 - classification_loss: 0.0720 234/500 [=============>................] - ETA: 1:30 - loss: 0.7825 - regression_loss: 0.7105 - classification_loss: 0.0720 235/500 [=============>................] - ETA: 1:30 - loss: 0.7829 - regression_loss: 0.7108 - classification_loss: 0.0721 236/500 [=============>................] - ETA: 1:29 - loss: 0.7848 - regression_loss: 0.7119 - classification_loss: 0.0729 237/500 [=============>................] - ETA: 1:29 - loss: 0.7843 - regression_loss: 0.7115 - classification_loss: 0.0728 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.7835 - regression_loss: 0.7109 - classification_loss: 0.0726 239/500 [=============>................] - ETA: 1:28 - loss: 0.7832 - regression_loss: 0.7105 - classification_loss: 0.0727 240/500 [=============>................] - ETA: 1:28 - loss: 0.7831 - regression_loss: 0.7106 - classification_loss: 0.0725 241/500 [=============>................] - ETA: 1:27 - loss: 0.7799 - regression_loss: 0.7076 - classification_loss: 0.0723 242/500 [=============>................] - ETA: 1:27 - loss: 0.7791 - regression_loss: 0.7069 - classification_loss: 0.0722 243/500 [=============>................] - ETA: 1:27 - loss: 0.7799 - regression_loss: 0.7075 - classification_loss: 0.0724 244/500 [=============>................] - ETA: 1:26 - loss: 0.7792 - regression_loss: 0.7069 - classification_loss: 0.0723 245/500 [=============>................] - ETA: 1:26 - loss: 0.7777 - regression_loss: 0.7056 - classification_loss: 0.0721 246/500 [=============>................] - ETA: 1:26 - loss: 0.7783 - regression_loss: 0.7059 - classification_loss: 0.0724 247/500 [=============>................] - ETA: 1:25 - loss: 0.7770 - regression_loss: 0.7049 - classification_loss: 0.0721 248/500 [=============>................] - ETA: 1:25 - loss: 0.7759 - regression_loss: 0.7040 - classification_loss: 0.0719 249/500 [=============>................] - ETA: 1:25 - loss: 0.7762 - regression_loss: 0.7040 - classification_loss: 0.0722 250/500 [==============>...............] - ETA: 1:24 - loss: 0.7760 - regression_loss: 0.7037 - classification_loss: 0.0723 251/500 [==============>...............] - ETA: 1:24 - loss: 0.7754 - regression_loss: 0.7032 - classification_loss: 0.0722 252/500 [==============>...............] - ETA: 1:24 - loss: 0.7755 - regression_loss: 0.7033 - classification_loss: 0.0723 253/500 [==============>...............] - ETA: 1:23 - loss: 0.7781 - regression_loss: 0.7057 - classification_loss: 0.0724 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.7774 - regression_loss: 0.7051 - classification_loss: 0.0723 255/500 [==============>...............] - ETA: 1:23 - loss: 0.7811 - regression_loss: 0.7083 - classification_loss: 0.0728 256/500 [==============>...............] - ETA: 1:22 - loss: 0.7786 - regression_loss: 0.7060 - classification_loss: 0.0726 257/500 [==============>...............] - ETA: 1:22 - loss: 0.7782 - regression_loss: 0.7057 - classification_loss: 0.0725 258/500 [==============>...............] - ETA: 1:22 - loss: 0.7784 - regression_loss: 0.7060 - classification_loss: 0.0724 259/500 [==============>...............] - ETA: 1:21 - loss: 0.7772 - regression_loss: 0.7050 - classification_loss: 0.0722 260/500 [==============>...............] - ETA: 1:21 - loss: 0.7778 - regression_loss: 0.7056 - classification_loss: 0.0722 261/500 [==============>...............] - ETA: 1:21 - loss: 0.7758 - regression_loss: 0.7038 - classification_loss: 0.0720 262/500 [==============>...............] - ETA: 1:20 - loss: 0.7764 - regression_loss: 0.7041 - classification_loss: 0.0722 263/500 [==============>...............] - ETA: 1:20 - loss: 0.7766 - regression_loss: 0.7044 - classification_loss: 0.0722 264/500 [==============>...............] - ETA: 1:20 - loss: 0.7768 - regression_loss: 0.7047 - classification_loss: 0.0721 265/500 [==============>...............] - ETA: 1:19 - loss: 0.7748 - regression_loss: 0.7030 - classification_loss: 0.0719 266/500 [==============>...............] - ETA: 1:19 - loss: 0.7782 - regression_loss: 0.7057 - classification_loss: 0.0725 267/500 [===============>..............] - ETA: 1:19 - loss: 0.7767 - regression_loss: 0.7044 - classification_loss: 0.0723 268/500 [===============>..............] - ETA: 1:18 - loss: 0.7765 - regression_loss: 0.7043 - classification_loss: 0.0722 269/500 [===============>..............] - ETA: 1:18 - loss: 0.7764 - regression_loss: 0.7042 - classification_loss: 0.0721 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.7768 - regression_loss: 0.7045 - classification_loss: 0.0723 271/500 [===============>..............] - ETA: 1:17 - loss: 0.7749 - regression_loss: 0.7028 - classification_loss: 0.0721 272/500 [===============>..............] - ETA: 1:17 - loss: 0.7751 - regression_loss: 0.7030 - classification_loss: 0.0720 273/500 [===============>..............] - ETA: 1:17 - loss: 0.7731 - regression_loss: 0.7013 - classification_loss: 0.0718 274/500 [===============>..............] - ETA: 1:16 - loss: 0.7743 - regression_loss: 0.7023 - classification_loss: 0.0721 275/500 [===============>..............] - ETA: 1:16 - loss: 0.7738 - regression_loss: 0.7018 - classification_loss: 0.0720 276/500 [===============>..............] - ETA: 1:16 - loss: 0.7742 - regression_loss: 0.7022 - classification_loss: 0.0720 277/500 [===============>..............] - ETA: 1:15 - loss: 0.7745 - regression_loss: 0.7025 - classification_loss: 0.0720 278/500 [===============>..............] - ETA: 1:15 - loss: 0.7742 - regression_loss: 0.7022 - classification_loss: 0.0720 279/500 [===============>..............] - ETA: 1:15 - loss: 0.7735 - regression_loss: 0.7016 - classification_loss: 0.0718 280/500 [===============>..............] - ETA: 1:14 - loss: 0.7721 - regression_loss: 0.7004 - classification_loss: 0.0716 281/500 [===============>..............] - ETA: 1:14 - loss: 0.7726 - regression_loss: 0.7009 - classification_loss: 0.0717 282/500 [===============>..............] - ETA: 1:14 - loss: 0.7720 - regression_loss: 0.7005 - classification_loss: 0.0716 283/500 [===============>..............] - ETA: 1:13 - loss: 0.7734 - regression_loss: 0.7016 - classification_loss: 0.0719 284/500 [================>.............] - ETA: 1:13 - loss: 0.7723 - regression_loss: 0.7006 - classification_loss: 0.0717 285/500 [================>.............] - ETA: 1:13 - loss: 0.7727 - regression_loss: 0.7012 - classification_loss: 0.0715 286/500 [================>.............] 
[per-batch progress lines for steps 287-499 of epoch 41 trimmed; loss hovered around 0.77-0.79]
500/500 [==============================] - 170s 339ms/step - loss: 0.7841 - regression_loss: 0.7073 - classification_loss: 0.0768
326 instances of class plum with average precision: 0.8387
mAP: 0.8387
Epoch 00041: saving model to ./training/snapshots/resnet101_pascal_41.h5
Epoch 42/150
[per-batch progress lines for steps 1-9 of epoch 42 trimmed]
[per-batch progress lines for steps 10-121 of epoch 42 trimmed; loss hovered around 0.77]
- ETA: 2:08 - loss: 0.7729 - regression_loss: 0.7040 - classification_loss: 0.0689 122/500 [======>.......................] - ETA: 2:07 - loss: 0.7704 - regression_loss: 0.7015 - classification_loss: 0.0688 123/500 [======>.......................] - ETA: 2:07 - loss: 0.7668 - regression_loss: 0.6984 - classification_loss: 0.0684 124/500 [======>.......................] - ETA: 2:07 - loss: 0.7675 - regression_loss: 0.6990 - classification_loss: 0.0685 125/500 [======>.......................] - ETA: 2:06 - loss: 0.7704 - regression_loss: 0.7018 - classification_loss: 0.0686 126/500 [======>.......................] - ETA: 2:06 - loss: 0.7699 - regression_loss: 0.7014 - classification_loss: 0.0685 127/500 [======>.......................] - ETA: 2:06 - loss: 0.7696 - regression_loss: 0.7012 - classification_loss: 0.0683 128/500 [======>.......................] - ETA: 2:05 - loss: 0.7698 - regression_loss: 0.7014 - classification_loss: 0.0684 129/500 [======>.......................] - ETA: 2:05 - loss: 0.7701 - regression_loss: 0.7019 - classification_loss: 0.0682 130/500 [======>.......................] - ETA: 2:05 - loss: 0.7666 - regression_loss: 0.6989 - classification_loss: 0.0677 131/500 [======>.......................] - ETA: 2:04 - loss: 0.7647 - regression_loss: 0.6974 - classification_loss: 0.0674 132/500 [======>.......................] - ETA: 2:04 - loss: 0.7650 - regression_loss: 0.6977 - classification_loss: 0.0674 133/500 [======>.......................] - ETA: 2:04 - loss: 0.7634 - regression_loss: 0.6963 - classification_loss: 0.0671 134/500 [=======>......................] - ETA: 2:03 - loss: 0.7636 - regression_loss: 0.6957 - classification_loss: 0.0679 135/500 [=======>......................] - ETA: 2:03 - loss: 0.7623 - regression_loss: 0.6943 - classification_loss: 0.0680 136/500 [=======>......................] - ETA: 2:03 - loss: 0.7644 - regression_loss: 0.6964 - classification_loss: 0.0680 137/500 [=======>......................] 
- ETA: 2:02 - loss: 0.7667 - regression_loss: 0.6987 - classification_loss: 0.0680 138/500 [=======>......................] - ETA: 2:02 - loss: 0.7687 - regression_loss: 0.7007 - classification_loss: 0.0680 139/500 [=======>......................] - ETA: 2:02 - loss: 0.7701 - regression_loss: 0.7018 - classification_loss: 0.0683 140/500 [=======>......................] - ETA: 2:01 - loss: 0.7678 - regression_loss: 0.6998 - classification_loss: 0.0680 141/500 [=======>......................] - ETA: 2:01 - loss: 0.7685 - regression_loss: 0.7005 - classification_loss: 0.0680 142/500 [=======>......................] - ETA: 2:01 - loss: 0.7707 - regression_loss: 0.7028 - classification_loss: 0.0679 143/500 [=======>......................] - ETA: 2:00 - loss: 0.7744 - regression_loss: 0.7057 - classification_loss: 0.0687 144/500 [=======>......................] - ETA: 2:00 - loss: 0.7832 - regression_loss: 0.7134 - classification_loss: 0.0698 145/500 [=======>......................] - ETA: 2:00 - loss: 0.7824 - regression_loss: 0.7128 - classification_loss: 0.0696 146/500 [=======>......................] - ETA: 1:59 - loss: 0.7837 - regression_loss: 0.7141 - classification_loss: 0.0696 147/500 [=======>......................] - ETA: 1:59 - loss: 0.7819 - regression_loss: 0.7126 - classification_loss: 0.0693 148/500 [=======>......................] - ETA: 1:59 - loss: 0.7818 - regression_loss: 0.7126 - classification_loss: 0.0693 149/500 [=======>......................] - ETA: 1:58 - loss: 0.7804 - regression_loss: 0.7113 - classification_loss: 0.0690 150/500 [========>.....................] - ETA: 1:58 - loss: 0.7768 - regression_loss: 0.7080 - classification_loss: 0.0688 151/500 [========>.....................] - ETA: 1:58 - loss: 0.7760 - regression_loss: 0.7075 - classification_loss: 0.0685 152/500 [========>.....................] - ETA: 1:57 - loss: 0.7774 - regression_loss: 0.7090 - classification_loss: 0.0683 153/500 [========>.....................] 
- ETA: 1:57 - loss: 0.7752 - regression_loss: 0.7072 - classification_loss: 0.0680 154/500 [========>.....................] - ETA: 1:57 - loss: 0.7744 - regression_loss: 0.7066 - classification_loss: 0.0678 155/500 [========>.....................] - ETA: 1:56 - loss: 0.7730 - regression_loss: 0.7051 - classification_loss: 0.0678 156/500 [========>.....................] - ETA: 1:56 - loss: 0.7696 - regression_loss: 0.7021 - classification_loss: 0.0675 157/500 [========>.....................] - ETA: 1:56 - loss: 0.7735 - regression_loss: 0.7041 - classification_loss: 0.0694 158/500 [========>.....................] - ETA: 1:55 - loss: 0.7723 - regression_loss: 0.7030 - classification_loss: 0.0693 159/500 [========>.....................] - ETA: 1:55 - loss: 0.7726 - regression_loss: 0.7033 - classification_loss: 0.0692 160/500 [========>.....................] - ETA: 1:54 - loss: 0.7761 - regression_loss: 0.7058 - classification_loss: 0.0703 161/500 [========>.....................] - ETA: 1:54 - loss: 0.7776 - regression_loss: 0.7071 - classification_loss: 0.0704 162/500 [========>.....................] - ETA: 1:54 - loss: 0.7772 - regression_loss: 0.7070 - classification_loss: 0.0701 163/500 [========>.....................] - ETA: 1:53 - loss: 0.7776 - regression_loss: 0.7076 - classification_loss: 0.0701 164/500 [========>.....................] - ETA: 1:53 - loss: 0.7781 - regression_loss: 0.7080 - classification_loss: 0.0701 165/500 [========>.....................] - ETA: 1:53 - loss: 0.7793 - regression_loss: 0.7092 - classification_loss: 0.0700 166/500 [========>.....................] - ETA: 1:52 - loss: 0.7772 - regression_loss: 0.7074 - classification_loss: 0.0699 167/500 [=========>....................] - ETA: 1:52 - loss: 0.7752 - regression_loss: 0.7056 - classification_loss: 0.0696 168/500 [=========>....................] - ETA: 1:52 - loss: 0.7751 - regression_loss: 0.7058 - classification_loss: 0.0693 169/500 [=========>....................] 
- ETA: 1:51 - loss: 0.7779 - regression_loss: 0.7079 - classification_loss: 0.0700 170/500 [=========>....................] - ETA: 1:51 - loss: 0.7757 - regression_loss: 0.7061 - classification_loss: 0.0696 171/500 [=========>....................] - ETA: 1:51 - loss: 0.7793 - regression_loss: 0.7084 - classification_loss: 0.0708 172/500 [=========>....................] - ETA: 1:50 - loss: 0.7774 - regression_loss: 0.7068 - classification_loss: 0.0706 173/500 [=========>....................] - ETA: 1:50 - loss: 0.7749 - regression_loss: 0.7047 - classification_loss: 0.0703 174/500 [=========>....................] - ETA: 1:50 - loss: 0.7752 - regression_loss: 0.7047 - classification_loss: 0.0704 175/500 [=========>....................] - ETA: 1:49 - loss: 0.7760 - regression_loss: 0.7058 - classification_loss: 0.0703 176/500 [=========>....................] - ETA: 1:49 - loss: 0.7753 - regression_loss: 0.7052 - classification_loss: 0.0701 177/500 [=========>....................] - ETA: 1:49 - loss: 0.7755 - regression_loss: 0.7053 - classification_loss: 0.0701 178/500 [=========>....................] - ETA: 1:48 - loss: 0.7729 - regression_loss: 0.7031 - classification_loss: 0.0698 179/500 [=========>....................] - ETA: 1:48 - loss: 0.7730 - regression_loss: 0.7033 - classification_loss: 0.0697 180/500 [=========>....................] - ETA: 1:48 - loss: 0.7745 - regression_loss: 0.7050 - classification_loss: 0.0695 181/500 [=========>....................] - ETA: 1:47 - loss: 0.7763 - regression_loss: 0.7065 - classification_loss: 0.0698 182/500 [=========>....................] - ETA: 1:47 - loss: 0.7744 - regression_loss: 0.7049 - classification_loss: 0.0695 183/500 [=========>....................] - ETA: 1:47 - loss: 0.7755 - regression_loss: 0.7059 - classification_loss: 0.0696 184/500 [==========>...................] - ETA: 1:46 - loss: 0.7758 - regression_loss: 0.7063 - classification_loss: 0.0695 185/500 [==========>...................] 
- ETA: 1:46 - loss: 0.7726 - regression_loss: 0.7033 - classification_loss: 0.0693 186/500 [==========>...................] - ETA: 1:46 - loss: 0.7732 - regression_loss: 0.7038 - classification_loss: 0.0695 187/500 [==========>...................] - ETA: 1:45 - loss: 0.7728 - regression_loss: 0.7034 - classification_loss: 0.0694 188/500 [==========>...................] - ETA: 1:45 - loss: 0.7723 - regression_loss: 0.7030 - classification_loss: 0.0694 189/500 [==========>...................] - ETA: 1:45 - loss: 0.7716 - regression_loss: 0.7025 - classification_loss: 0.0691 190/500 [==========>...................] - ETA: 1:44 - loss: 0.7676 - regression_loss: 0.6988 - classification_loss: 0.0688 191/500 [==========>...................] - ETA: 1:44 - loss: 0.7669 - regression_loss: 0.6981 - classification_loss: 0.0688 192/500 [==========>...................] - ETA: 1:44 - loss: 0.7699 - regression_loss: 0.7012 - classification_loss: 0.0687 193/500 [==========>...................] - ETA: 1:43 - loss: 0.7729 - regression_loss: 0.7029 - classification_loss: 0.0701 194/500 [==========>...................] - ETA: 1:43 - loss: 0.7724 - regression_loss: 0.7024 - classification_loss: 0.0699 195/500 [==========>...................] - ETA: 1:43 - loss: 0.7708 - regression_loss: 0.7011 - classification_loss: 0.0697 196/500 [==========>...................] - ETA: 1:42 - loss: 0.7709 - regression_loss: 0.7011 - classification_loss: 0.0698 197/500 [==========>...................] - ETA: 1:42 - loss: 0.7704 - regression_loss: 0.7004 - classification_loss: 0.0699 198/500 [==========>...................] - ETA: 1:42 - loss: 0.7699 - regression_loss: 0.7001 - classification_loss: 0.0698 199/500 [==========>...................] - ETA: 1:41 - loss: 0.7704 - regression_loss: 0.7006 - classification_loss: 0.0698 200/500 [===========>..................] - ETA: 1:41 - loss: 0.7678 - regression_loss: 0.6984 - classification_loss: 0.0695 201/500 [===========>..................] 
- ETA: 1:41 - loss: 0.7672 - regression_loss: 0.6977 - classification_loss: 0.0695 202/500 [===========>..................] - ETA: 1:40 - loss: 0.7689 - regression_loss: 0.6990 - classification_loss: 0.0700 203/500 [===========>..................] - ETA: 1:40 - loss: 0.7672 - regression_loss: 0.6974 - classification_loss: 0.0698 204/500 [===========>..................] - ETA: 1:40 - loss: 0.7685 - regression_loss: 0.6988 - classification_loss: 0.0697 205/500 [===========>..................] - ETA: 1:39 - loss: 0.7681 - regression_loss: 0.6985 - classification_loss: 0.0697 206/500 [===========>..................] - ETA: 1:39 - loss: 0.7665 - regression_loss: 0.6971 - classification_loss: 0.0694 207/500 [===========>..................] - ETA: 1:39 - loss: 0.7685 - regression_loss: 0.6984 - classification_loss: 0.0701 208/500 [===========>..................] - ETA: 1:38 - loss: 0.7668 - regression_loss: 0.6969 - classification_loss: 0.0699 209/500 [===========>..................] - ETA: 1:38 - loss: 0.7655 - regression_loss: 0.6959 - classification_loss: 0.0696 210/500 [===========>..................] - ETA: 1:37 - loss: 0.7664 - regression_loss: 0.6965 - classification_loss: 0.0699 211/500 [===========>..................] - ETA: 1:37 - loss: 0.7655 - regression_loss: 0.6958 - classification_loss: 0.0697 212/500 [===========>..................] - ETA: 1:37 - loss: 0.7640 - regression_loss: 0.6946 - classification_loss: 0.0695 213/500 [===========>..................] - ETA: 1:37 - loss: 0.7643 - regression_loss: 0.6948 - classification_loss: 0.0696 214/500 [===========>..................] - ETA: 1:36 - loss: 0.7657 - regression_loss: 0.6960 - classification_loss: 0.0697 215/500 [===========>..................] - ETA: 1:36 - loss: 0.7651 - regression_loss: 0.6955 - classification_loss: 0.0696 216/500 [===========>..................] - ETA: 1:36 - loss: 0.7660 - regression_loss: 0.6962 - classification_loss: 0.0698 217/500 [============>.................] 
- ETA: 1:35 - loss: 0.7639 - regression_loss: 0.6943 - classification_loss: 0.0696 218/500 [============>.................] - ETA: 1:35 - loss: 0.7640 - regression_loss: 0.6945 - classification_loss: 0.0694 219/500 [============>.................] - ETA: 1:35 - loss: 0.7639 - regression_loss: 0.6945 - classification_loss: 0.0694 220/500 [============>.................] - ETA: 1:34 - loss: 0.7624 - regression_loss: 0.6932 - classification_loss: 0.0692 221/500 [============>.................] - ETA: 1:34 - loss: 0.7629 - regression_loss: 0.6933 - classification_loss: 0.0696 222/500 [============>.................] - ETA: 1:33 - loss: 0.7643 - regression_loss: 0.6948 - classification_loss: 0.0695 223/500 [============>.................] - ETA: 1:33 - loss: 0.7655 - regression_loss: 0.6957 - classification_loss: 0.0698 224/500 [============>.................] - ETA: 1:33 - loss: 0.7644 - regression_loss: 0.6947 - classification_loss: 0.0696 225/500 [============>.................] - ETA: 1:33 - loss: 0.7626 - regression_loss: 0.6932 - classification_loss: 0.0694 226/500 [============>.................] - ETA: 1:32 - loss: 0.7621 - regression_loss: 0.6928 - classification_loss: 0.0693 227/500 [============>.................] - ETA: 1:32 - loss: 0.7624 - regression_loss: 0.6933 - classification_loss: 0.0692 228/500 [============>.................] - ETA: 1:32 - loss: 0.7632 - regression_loss: 0.6942 - classification_loss: 0.0690 229/500 [============>.................] - ETA: 1:31 - loss: 0.7617 - regression_loss: 0.6929 - classification_loss: 0.0688 230/500 [============>.................] - ETA: 1:31 - loss: 0.7612 - regression_loss: 0.6924 - classification_loss: 0.0688 231/500 [============>.................] - ETA: 1:31 - loss: 0.7642 - regression_loss: 0.6948 - classification_loss: 0.0694 232/500 [============>.................] - ETA: 1:30 - loss: 0.7641 - regression_loss: 0.6949 - classification_loss: 0.0692 233/500 [============>.................] 
- ETA: 1:30 - loss: 0.7637 - regression_loss: 0.6945 - classification_loss: 0.0692 234/500 [=============>................] - ETA: 1:29 - loss: 0.7639 - regression_loss: 0.6949 - classification_loss: 0.0690 235/500 [=============>................] - ETA: 1:29 - loss: 0.7658 - regression_loss: 0.6961 - classification_loss: 0.0697 236/500 [=============>................] - ETA: 1:29 - loss: 0.7662 - regression_loss: 0.6964 - classification_loss: 0.0698 237/500 [=============>................] - ETA: 1:28 - loss: 0.7655 - regression_loss: 0.6958 - classification_loss: 0.0697 238/500 [=============>................] - ETA: 1:28 - loss: 0.7679 - regression_loss: 0.6979 - classification_loss: 0.0699 239/500 [=============>................] - ETA: 1:28 - loss: 0.7681 - regression_loss: 0.6982 - classification_loss: 0.0699 240/500 [=============>................] - ETA: 1:27 - loss: 0.7696 - regression_loss: 0.6995 - classification_loss: 0.0700 241/500 [=============>................] - ETA: 1:27 - loss: 0.7700 - regression_loss: 0.6999 - classification_loss: 0.0701 242/500 [=============>................] - ETA: 1:27 - loss: 0.7713 - regression_loss: 0.7009 - classification_loss: 0.0704 243/500 [=============>................] - ETA: 1:26 - loss: 0.7721 - regression_loss: 0.7014 - classification_loss: 0.0707 244/500 [=============>................] - ETA: 1:26 - loss: 0.7718 - regression_loss: 0.7012 - classification_loss: 0.0706 245/500 [=============>................] - ETA: 1:26 - loss: 0.7740 - regression_loss: 0.7034 - classification_loss: 0.0706 246/500 [=============>................] - ETA: 1:25 - loss: 0.7744 - regression_loss: 0.7037 - classification_loss: 0.0707 247/500 [=============>................] - ETA: 1:25 - loss: 0.7733 - regression_loss: 0.7027 - classification_loss: 0.0706 248/500 [=============>................] - ETA: 1:25 - loss: 0.7736 - regression_loss: 0.7031 - classification_loss: 0.0705 249/500 [=============>................] 
- ETA: 1:24 - loss: 0.7755 - regression_loss: 0.7048 - classification_loss: 0.0707 250/500 [==============>...............] - ETA: 1:24 - loss: 0.7739 - regression_loss: 0.7035 - classification_loss: 0.0704 251/500 [==============>...............] - ETA: 1:24 - loss: 0.7736 - regression_loss: 0.7033 - classification_loss: 0.0703 252/500 [==============>...............] - ETA: 1:23 - loss: 0.7730 - regression_loss: 0.7029 - classification_loss: 0.0701 253/500 [==============>...............] - ETA: 1:23 - loss: 0.7732 - regression_loss: 0.7032 - classification_loss: 0.0700 254/500 [==============>...............] - ETA: 1:23 - loss: 0.7791 - regression_loss: 0.7086 - classification_loss: 0.0704 255/500 [==============>...............] - ETA: 1:22 - loss: 0.7776 - regression_loss: 0.7073 - classification_loss: 0.0702 256/500 [==============>...............] - ETA: 1:22 - loss: 0.7763 - regression_loss: 0.7062 - classification_loss: 0.0701 257/500 [==============>...............] - ETA: 1:22 - loss: 0.7759 - regression_loss: 0.7058 - classification_loss: 0.0700 258/500 [==============>...............] - ETA: 1:21 - loss: 0.7744 - regression_loss: 0.7046 - classification_loss: 0.0698 259/500 [==============>...............] - ETA: 1:21 - loss: 0.7736 - regression_loss: 0.7039 - classification_loss: 0.0697 260/500 [==============>...............] - ETA: 1:21 - loss: 0.7751 - regression_loss: 0.7053 - classification_loss: 0.0698 261/500 [==============>...............] - ETA: 1:20 - loss: 0.7747 - regression_loss: 0.7050 - classification_loss: 0.0697 262/500 [==============>...............] - ETA: 1:20 - loss: 0.7746 - regression_loss: 0.7050 - classification_loss: 0.0695 263/500 [==============>...............] - ETA: 1:20 - loss: 0.7745 - regression_loss: 0.7050 - classification_loss: 0.0695 264/500 [==============>...............] - ETA: 1:19 - loss: 0.7761 - regression_loss: 0.7061 - classification_loss: 0.0700 265/500 [==============>...............] 
- ETA: 1:19 - loss: 0.7749 - regression_loss: 0.7051 - classification_loss: 0.0698 266/500 [==============>...............] - ETA: 1:19 - loss: 0.7752 - regression_loss: 0.7053 - classification_loss: 0.0700 267/500 [===============>..............] - ETA: 1:18 - loss: 0.7765 - regression_loss: 0.7063 - classification_loss: 0.0702 268/500 [===============>..............] - ETA: 1:18 - loss: 0.7796 - regression_loss: 0.7089 - classification_loss: 0.0707 269/500 [===============>..............] - ETA: 1:18 - loss: 0.7791 - regression_loss: 0.7084 - classification_loss: 0.0707 270/500 [===============>..............] - ETA: 1:17 - loss: 0.7796 - regression_loss: 0.7087 - classification_loss: 0.0710 271/500 [===============>..............] - ETA: 1:17 - loss: 0.7784 - regression_loss: 0.7077 - classification_loss: 0.0708 272/500 [===============>..............] - ETA: 1:17 - loss: 0.7795 - regression_loss: 0.7085 - classification_loss: 0.0710 273/500 [===============>..............] - ETA: 1:16 - loss: 0.7798 - regression_loss: 0.7089 - classification_loss: 0.0710 274/500 [===============>..............] - ETA: 1:16 - loss: 0.7786 - regression_loss: 0.7075 - classification_loss: 0.0711 275/500 [===============>..............] - ETA: 1:16 - loss: 0.7777 - regression_loss: 0.7068 - classification_loss: 0.0709 276/500 [===============>..............] - ETA: 1:15 - loss: 0.7786 - regression_loss: 0.7078 - classification_loss: 0.0708 277/500 [===============>..............] - ETA: 1:15 - loss: 0.7777 - regression_loss: 0.7070 - classification_loss: 0.0707 278/500 [===============>..............] - ETA: 1:15 - loss: 0.7784 - regression_loss: 0.7078 - classification_loss: 0.0706 279/500 [===============>..............] - ETA: 1:14 - loss: 0.7785 - regression_loss: 0.7079 - classification_loss: 0.0705 280/500 [===============>..............] - ETA: 1:14 - loss: 0.7809 - regression_loss: 0.7101 - classification_loss: 0.0708 281/500 [===============>..............] 
- ETA: 1:14 - loss: 0.7795 - regression_loss: 0.7089 - classification_loss: 0.0706 282/500 [===============>..............] - ETA: 1:13 - loss: 0.7791 - regression_loss: 0.7086 - classification_loss: 0.0705 283/500 [===============>..............] - ETA: 1:13 - loss: 0.7786 - regression_loss: 0.7081 - classification_loss: 0.0705 284/500 [================>.............] - ETA: 1:12 - loss: 0.7773 - regression_loss: 0.7070 - classification_loss: 0.0703 285/500 [================>.............] - ETA: 1:12 - loss: 0.7770 - regression_loss: 0.7068 - classification_loss: 0.0702 286/500 [================>.............] - ETA: 1:12 - loss: 0.7772 - regression_loss: 0.7069 - classification_loss: 0.0703 287/500 [================>.............] - ETA: 1:11 - loss: 0.7766 - regression_loss: 0.7064 - classification_loss: 0.0702 288/500 [================>.............] - ETA: 1:11 - loss: 0.7768 - regression_loss: 0.7067 - classification_loss: 0.0702 289/500 [================>.............] - ETA: 1:11 - loss: 0.7793 - regression_loss: 0.7088 - classification_loss: 0.0705 290/500 [================>.............] - ETA: 1:10 - loss: 0.7781 - regression_loss: 0.7078 - classification_loss: 0.0703 291/500 [================>.............] - ETA: 1:10 - loss: 0.7790 - regression_loss: 0.7086 - classification_loss: 0.0703 292/500 [================>.............] - ETA: 1:10 - loss: 0.7812 - regression_loss: 0.7104 - classification_loss: 0.0708 293/500 [================>.............] - ETA: 1:09 - loss: 0.7816 - regression_loss: 0.7108 - classification_loss: 0.0708 294/500 [================>.............] - ETA: 1:09 - loss: 0.7816 - regression_loss: 0.7108 - classification_loss: 0.0709 295/500 [================>.............] - ETA: 1:09 - loss: 0.7821 - regression_loss: 0.7109 - classification_loss: 0.0711 296/500 [================>.............] - ETA: 1:08 - loss: 0.7808 - regression_loss: 0.7099 - classification_loss: 0.0710 297/500 [================>.............] 
- ETA: 1:08 - loss: 0.7798 - regression_loss: 0.7089 - classification_loss: 0.0708 298/500 [================>.............] - ETA: 1:08 - loss: 0.7796 - regression_loss: 0.7088 - classification_loss: 0.0708 299/500 [================>.............] - ETA: 1:07 - loss: 0.7793 - regression_loss: 0.7087 - classification_loss: 0.0707 300/500 [=================>............] - ETA: 1:07 - loss: 0.7793 - regression_loss: 0.7086 - classification_loss: 0.0706 301/500 [=================>............] - ETA: 1:07 - loss: 0.7796 - regression_loss: 0.7091 - classification_loss: 0.0705 302/500 [=================>............] - ETA: 1:06 - loss: 0.7794 - regression_loss: 0.7090 - classification_loss: 0.0704 303/500 [=================>............] - ETA: 1:06 - loss: 0.7785 - regression_loss: 0.7083 - classification_loss: 0.0702 304/500 [=================>............] - ETA: 1:06 - loss: 0.7789 - regression_loss: 0.7084 - classification_loss: 0.0704 305/500 [=================>............] - ETA: 1:05 - loss: 0.7778 - regression_loss: 0.7076 - classification_loss: 0.0703 306/500 [=================>............] - ETA: 1:05 - loss: 0.7789 - regression_loss: 0.7083 - classification_loss: 0.0706 307/500 [=================>............] - ETA: 1:05 - loss: 0.7790 - regression_loss: 0.7086 - classification_loss: 0.0704 308/500 [=================>............] - ETA: 1:04 - loss: 0.7787 - regression_loss: 0.7084 - classification_loss: 0.0703 309/500 [=================>............] - ETA: 1:04 - loss: 0.7801 - regression_loss: 0.7096 - classification_loss: 0.0705 310/500 [=================>............] - ETA: 1:04 - loss: 0.7820 - regression_loss: 0.7110 - classification_loss: 0.0710 311/500 [=================>............] - ETA: 1:03 - loss: 0.7823 - regression_loss: 0.7113 - classification_loss: 0.0710 312/500 [=================>............] - ETA: 1:03 - loss: 0.7809 - regression_loss: 0.7100 - classification_loss: 0.0709 313/500 [=================>............] 
- ETA: 1:03 - loss: 0.7798 - regression_loss: 0.7090 - classification_loss: 0.0708 314/500 [=================>............] - ETA: 1:02 - loss: 0.7798 - regression_loss: 0.7091 - classification_loss: 0.0707 315/500 [=================>............] - ETA: 1:02 - loss: 0.7790 - regression_loss: 0.7082 - classification_loss: 0.0707 316/500 [=================>............] - ETA: 1:02 - loss: 0.7787 - regression_loss: 0.7081 - classification_loss: 0.0707 317/500 [==================>...........] - ETA: 1:01 - loss: 0.7791 - regression_loss: 0.7084 - classification_loss: 0.0707 318/500 [==================>...........] - ETA: 1:01 - loss: 0.7783 - regression_loss: 0.7077 - classification_loss: 0.0705 319/500 [==================>...........] - ETA: 1:01 - loss: 0.7778 - regression_loss: 0.7073 - classification_loss: 0.0705 320/500 [==================>...........] - ETA: 1:00 - loss: 0.7769 - regression_loss: 0.7066 - classification_loss: 0.0703 321/500 [==================>...........] - ETA: 1:00 - loss: 0.7795 - regression_loss: 0.7091 - classification_loss: 0.0704 322/500 [==================>...........] - ETA: 1:00 - loss: 0.7799 - regression_loss: 0.7095 - classification_loss: 0.0704 323/500 [==================>...........] - ETA: 59s - loss: 0.7797 - regression_loss: 0.7093 - classification_loss: 0.0704  324/500 [==================>...........] - ETA: 59s - loss: 0.7793 - regression_loss: 0.7089 - classification_loss: 0.0703 325/500 [==================>...........] - ETA: 59s - loss: 0.7795 - regression_loss: 0.7092 - classification_loss: 0.0703 326/500 [==================>...........] - ETA: 58s - loss: 0.7794 - regression_loss: 0.7092 - classification_loss: 0.0702 327/500 [==================>...........] - ETA: 58s - loss: 0.7790 - regression_loss: 0.7088 - classification_loss: 0.0701 328/500 [==================>...........] - ETA: 58s - loss: 0.7782 - regression_loss: 0.7083 - classification_loss: 0.0700 329/500 [==================>...........] 
- ... [per-batch progress output for steps 330-499 of epoch 42 omitted; the running loss held steady around 0.77-0.79, with regression_loss near 0.70 and classification_loss near 0.07] ...
500/500 [==============================] - 169s 338ms/step - loss: 0.7701 - regression_loss: 0.6995 - classification_loss: 0.0706
326 instances of class plum with average precision: 0.8393
mAP: 0.8393
Epoch 00042: saving model to ./training/snapshots/resnet101_pascal_42.h5
Epoch 43/150
  1/500 [..............................] - ETA: 2:38 - loss: 0.8756 - regression_loss: 0.7941 - classification_loss: 0.0815
  2/500 [..............................] - ETA: 2:41 - loss: 0.7989 - regression_loss: 0.7466 - classification_loss: 0.0523
  3/500 [..............................] - ETA: 2:42 - loss: 0.8117 - regression_loss: 0.7428 - classification_loss: 0.0689
  4/500 [..............................]
- ... [per-batch progress output for steps 4-163 of epoch 43 omitted; the running loss fluctuated between roughly 0.77 and 0.92 early in the epoch before settling near 0.80 by step 160] ...
164/500 [========>.....................]
- ETA: 1:53 - loss: 0.7959 - regression_loss: 0.7256 - classification_loss: 0.0703 165/500 [========>.....................] - ETA: 1:53 - loss: 0.7920 - regression_loss: 0.7220 - classification_loss: 0.0700 166/500 [========>.....................] - ETA: 1:53 - loss: 0.7900 - regression_loss: 0.7203 - classification_loss: 0.0697 167/500 [=========>....................] - ETA: 1:52 - loss: 0.7892 - regression_loss: 0.7195 - classification_loss: 0.0697 168/500 [=========>....................] - ETA: 1:52 - loss: 0.7894 - regression_loss: 0.7200 - classification_loss: 0.0695 169/500 [=========>....................] - ETA: 1:52 - loss: 0.7897 - regression_loss: 0.7201 - classification_loss: 0.0696 170/500 [=========>....................] - ETA: 1:51 - loss: 0.7919 - regression_loss: 0.7222 - classification_loss: 0.0697 171/500 [=========>....................] - ETA: 1:51 - loss: 0.7922 - regression_loss: 0.7225 - classification_loss: 0.0697 172/500 [=========>....................] - ETA: 1:51 - loss: 0.7909 - regression_loss: 0.7213 - classification_loss: 0.0695 173/500 [=========>....................] - ETA: 1:50 - loss: 0.7957 - regression_loss: 0.7255 - classification_loss: 0.0702 174/500 [=========>....................] - ETA: 1:50 - loss: 0.7990 - regression_loss: 0.7278 - classification_loss: 0.0711 175/500 [=========>....................] - ETA: 1:50 - loss: 0.7980 - regression_loss: 0.7271 - classification_loss: 0.0709 176/500 [=========>....................] - ETA: 1:49 - loss: 0.7984 - regression_loss: 0.7277 - classification_loss: 0.0707 177/500 [=========>....................] - ETA: 1:49 - loss: 0.7958 - regression_loss: 0.7254 - classification_loss: 0.0704 178/500 [=========>....................] - ETA: 1:49 - loss: 0.7947 - regression_loss: 0.7245 - classification_loss: 0.0702 179/500 [=========>....................] - ETA: 1:48 - loss: 0.7926 - regression_loss: 0.7227 - classification_loss: 0.0699 180/500 [=========>....................] 
- ETA: 1:48 - loss: 0.7922 - regression_loss: 0.7224 - classification_loss: 0.0698 181/500 [=========>....................] - ETA: 1:48 - loss: 0.7889 - regression_loss: 0.7194 - classification_loss: 0.0695 182/500 [=========>....................] - ETA: 1:47 - loss: 0.7899 - regression_loss: 0.7203 - classification_loss: 0.0696 183/500 [=========>....................] - ETA: 1:47 - loss: 0.7885 - regression_loss: 0.7189 - classification_loss: 0.0696 184/500 [==========>...................] - ETA: 1:47 - loss: 0.7881 - regression_loss: 0.7187 - classification_loss: 0.0694 185/500 [==========>...................] - ETA: 1:46 - loss: 0.7860 - regression_loss: 0.7168 - classification_loss: 0.0692 186/500 [==========>...................] - ETA: 1:46 - loss: 0.7841 - regression_loss: 0.7152 - classification_loss: 0.0689 187/500 [==========>...................] - ETA: 1:46 - loss: 0.7831 - regression_loss: 0.7144 - classification_loss: 0.0687 188/500 [==========>...................] - ETA: 1:45 - loss: 0.7816 - regression_loss: 0.7130 - classification_loss: 0.0686 189/500 [==========>...................] - ETA: 1:45 - loss: 0.7836 - regression_loss: 0.7142 - classification_loss: 0.0693 190/500 [==========>...................] - ETA: 1:45 - loss: 0.7822 - regression_loss: 0.7131 - classification_loss: 0.0691 191/500 [==========>...................] - ETA: 1:44 - loss: 0.7808 - regression_loss: 0.7120 - classification_loss: 0.0688 192/500 [==========>...................] - ETA: 1:44 - loss: 0.7801 - regression_loss: 0.7111 - classification_loss: 0.0690 193/500 [==========>...................] - ETA: 1:44 - loss: 0.7845 - regression_loss: 0.7148 - classification_loss: 0.0697 194/500 [==========>...................] - ETA: 1:43 - loss: 0.7842 - regression_loss: 0.7146 - classification_loss: 0.0696 195/500 [==========>...................] - ETA: 1:43 - loss: 0.7837 - regression_loss: 0.7142 - classification_loss: 0.0695 196/500 [==========>...................] 
- ETA: 1:43 - loss: 0.7880 - regression_loss: 0.7175 - classification_loss: 0.0704 197/500 [==========>...................] - ETA: 1:42 - loss: 0.7856 - regression_loss: 0.7154 - classification_loss: 0.0702 198/500 [==========>...................] - ETA: 1:42 - loss: 0.7849 - regression_loss: 0.7147 - classification_loss: 0.0702 199/500 [==========>...................] - ETA: 1:42 - loss: 0.7847 - regression_loss: 0.7144 - classification_loss: 0.0704 200/500 [===========>..................] - ETA: 1:41 - loss: 0.7808 - regression_loss: 0.7108 - classification_loss: 0.0700 201/500 [===========>..................] - ETA: 1:41 - loss: 0.7800 - regression_loss: 0.7101 - classification_loss: 0.0698 202/500 [===========>..................] - ETA: 1:41 - loss: 0.7796 - regression_loss: 0.7097 - classification_loss: 0.0699 203/500 [===========>..................] - ETA: 1:40 - loss: 0.7803 - regression_loss: 0.7105 - classification_loss: 0.0698 204/500 [===========>..................] - ETA: 1:40 - loss: 0.7822 - regression_loss: 0.7124 - classification_loss: 0.0698 205/500 [===========>..................] - ETA: 1:40 - loss: 0.7827 - regression_loss: 0.7130 - classification_loss: 0.0697 206/500 [===========>..................] - ETA: 1:39 - loss: 0.7804 - regression_loss: 0.7109 - classification_loss: 0.0695 207/500 [===========>..................] - ETA: 1:39 - loss: 0.7793 - regression_loss: 0.7100 - classification_loss: 0.0693 208/500 [===========>..................] - ETA: 1:39 - loss: 0.7798 - regression_loss: 0.7105 - classification_loss: 0.0693 209/500 [===========>..................] - ETA: 1:38 - loss: 0.7794 - regression_loss: 0.7102 - classification_loss: 0.0692 210/500 [===========>..................] - ETA: 1:38 - loss: 0.7808 - regression_loss: 0.7116 - classification_loss: 0.0693 211/500 [===========>..................] - ETA: 1:38 - loss: 0.7809 - regression_loss: 0.7117 - classification_loss: 0.0692 212/500 [===========>..................] 
- ETA: 1:37 - loss: 0.7793 - regression_loss: 0.7103 - classification_loss: 0.0690 213/500 [===========>..................] - ETA: 1:37 - loss: 0.7790 - regression_loss: 0.7100 - classification_loss: 0.0690 214/500 [===========>..................] - ETA: 1:37 - loss: 0.7784 - regression_loss: 0.7095 - classification_loss: 0.0689 215/500 [===========>..................] - ETA: 1:36 - loss: 0.7774 - regression_loss: 0.7087 - classification_loss: 0.0687 216/500 [===========>..................] - ETA: 1:36 - loss: 0.7777 - regression_loss: 0.7090 - classification_loss: 0.0687 217/500 [============>.................] - ETA: 1:36 - loss: 0.7793 - regression_loss: 0.7102 - classification_loss: 0.0691 218/500 [============>.................] - ETA: 1:35 - loss: 0.7796 - regression_loss: 0.7106 - classification_loss: 0.0690 219/500 [============>.................] - ETA: 1:35 - loss: 0.7798 - regression_loss: 0.7106 - classification_loss: 0.0692 220/500 [============>.................] - ETA: 1:35 - loss: 0.7803 - regression_loss: 0.7111 - classification_loss: 0.0692 221/500 [============>.................] - ETA: 1:34 - loss: 0.7800 - regression_loss: 0.7110 - classification_loss: 0.0691 222/500 [============>.................] - ETA: 1:34 - loss: 0.7802 - regression_loss: 0.7111 - classification_loss: 0.0691 223/500 [============>.................] - ETA: 1:34 - loss: 0.7812 - regression_loss: 0.7120 - classification_loss: 0.0692 224/500 [============>.................] - ETA: 1:33 - loss: 0.7810 - regression_loss: 0.7119 - classification_loss: 0.0691 225/500 [============>.................] - ETA: 1:33 - loss: 0.7804 - regression_loss: 0.7114 - classification_loss: 0.0690 226/500 [============>.................] - ETA: 1:32 - loss: 0.7786 - regression_loss: 0.7100 - classification_loss: 0.0687 227/500 [============>.................] - ETA: 1:32 - loss: 0.7784 - regression_loss: 0.7097 - classification_loss: 0.0687 228/500 [============>.................] 
- ETA: 1:32 - loss: 0.7771 - regression_loss: 0.7086 - classification_loss: 0.0685 229/500 [============>.................] - ETA: 1:31 - loss: 0.7762 - regression_loss: 0.7079 - classification_loss: 0.0684 230/500 [============>.................] - ETA: 1:31 - loss: 0.7750 - regression_loss: 0.7069 - classification_loss: 0.0682 231/500 [============>.................] - ETA: 1:31 - loss: 0.7751 - regression_loss: 0.7070 - classification_loss: 0.0681 232/500 [============>.................] - ETA: 1:30 - loss: 0.7734 - regression_loss: 0.7054 - classification_loss: 0.0680 233/500 [============>.................] - ETA: 1:30 - loss: 0.7753 - regression_loss: 0.7075 - classification_loss: 0.0678 234/500 [=============>................] - ETA: 1:30 - loss: 0.7749 - regression_loss: 0.7072 - classification_loss: 0.0676 235/500 [=============>................] - ETA: 1:29 - loss: 0.7745 - regression_loss: 0.7069 - classification_loss: 0.0676 236/500 [=============>................] - ETA: 1:29 - loss: 0.7736 - regression_loss: 0.7062 - classification_loss: 0.0674 237/500 [=============>................] - ETA: 1:29 - loss: 0.7743 - regression_loss: 0.7067 - classification_loss: 0.0676 238/500 [=============>................] - ETA: 1:28 - loss: 0.7733 - regression_loss: 0.7058 - classification_loss: 0.0675 239/500 [=============>................] - ETA: 1:28 - loss: 0.7719 - regression_loss: 0.7046 - classification_loss: 0.0673 240/500 [=============>................] - ETA: 1:28 - loss: 0.7731 - regression_loss: 0.7057 - classification_loss: 0.0674 241/500 [=============>................] - ETA: 1:27 - loss: 0.7717 - regression_loss: 0.7044 - classification_loss: 0.0673 242/500 [=============>................] - ETA: 1:27 - loss: 0.7728 - regression_loss: 0.7055 - classification_loss: 0.0673 243/500 [=============>................] - ETA: 1:27 - loss: 0.7714 - regression_loss: 0.7043 - classification_loss: 0.0671 244/500 [=============>................] 
- ETA: 1:26 - loss: 0.7722 - regression_loss: 0.7049 - classification_loss: 0.0673 245/500 [=============>................] - ETA: 1:26 - loss: 0.7709 - regression_loss: 0.7037 - classification_loss: 0.0672 246/500 [=============>................] - ETA: 1:26 - loss: 0.7710 - regression_loss: 0.7037 - classification_loss: 0.0674 247/500 [=============>................] - ETA: 1:25 - loss: 0.7707 - regression_loss: 0.7034 - classification_loss: 0.0674 248/500 [=============>................] - ETA: 1:25 - loss: 0.7701 - regression_loss: 0.7029 - classification_loss: 0.0672 249/500 [=============>................] - ETA: 1:25 - loss: 0.7687 - regression_loss: 0.7017 - classification_loss: 0.0671 250/500 [==============>...............] - ETA: 1:24 - loss: 0.7707 - regression_loss: 0.7033 - classification_loss: 0.0673 251/500 [==============>...............] - ETA: 1:24 - loss: 0.7710 - regression_loss: 0.7037 - classification_loss: 0.0673 252/500 [==============>...............] - ETA: 1:24 - loss: 0.7751 - regression_loss: 0.7073 - classification_loss: 0.0678 253/500 [==============>...............] - ETA: 1:23 - loss: 0.7732 - regression_loss: 0.7056 - classification_loss: 0.0675 254/500 [==============>...............] - ETA: 1:23 - loss: 0.7772 - regression_loss: 0.7091 - classification_loss: 0.0680 255/500 [==============>...............] - ETA: 1:23 - loss: 0.7775 - regression_loss: 0.7094 - classification_loss: 0.0681 256/500 [==============>...............] - ETA: 1:22 - loss: 0.7760 - regression_loss: 0.7081 - classification_loss: 0.0680 257/500 [==============>...............] - ETA: 1:22 - loss: 0.7759 - regression_loss: 0.7079 - classification_loss: 0.0680 258/500 [==============>...............] - ETA: 1:22 - loss: 0.7747 - regression_loss: 0.7069 - classification_loss: 0.0679 259/500 [==============>...............] - ETA: 1:21 - loss: 0.7741 - regression_loss: 0.7065 - classification_loss: 0.0677 260/500 [==============>...............] 
- ETA: 1:21 - loss: 0.7759 - regression_loss: 0.7077 - classification_loss: 0.0682 261/500 [==============>...............] - ETA: 1:21 - loss: 0.7769 - regression_loss: 0.7088 - classification_loss: 0.0682 262/500 [==============>...............] - ETA: 1:20 - loss: 0.7770 - regression_loss: 0.7090 - classification_loss: 0.0681 263/500 [==============>...............] - ETA: 1:20 - loss: 0.7769 - regression_loss: 0.7090 - classification_loss: 0.0680 264/500 [==============>...............] - ETA: 1:20 - loss: 0.7761 - regression_loss: 0.7083 - classification_loss: 0.0678 265/500 [==============>...............] - ETA: 1:19 - loss: 0.7761 - regression_loss: 0.7084 - classification_loss: 0.0677 266/500 [==============>...............] - ETA: 1:19 - loss: 0.7765 - regression_loss: 0.7088 - classification_loss: 0.0677 267/500 [===============>..............] - ETA: 1:19 - loss: 0.7751 - regression_loss: 0.7076 - classification_loss: 0.0675 268/500 [===============>..............] - ETA: 1:18 - loss: 0.7753 - regression_loss: 0.7078 - classification_loss: 0.0675 269/500 [===============>..............] - ETA: 1:18 - loss: 0.7754 - regression_loss: 0.7079 - classification_loss: 0.0675 270/500 [===============>..............] - ETA: 1:17 - loss: 0.7768 - regression_loss: 0.7088 - classification_loss: 0.0680 271/500 [===============>..............] - ETA: 1:17 - loss: 0.7764 - regression_loss: 0.7085 - classification_loss: 0.0679 272/500 [===============>..............] - ETA: 1:17 - loss: 0.7749 - regression_loss: 0.7072 - classification_loss: 0.0677 273/500 [===============>..............] - ETA: 1:16 - loss: 0.7759 - regression_loss: 0.7083 - classification_loss: 0.0677 274/500 [===============>..............] - ETA: 1:16 - loss: 0.7750 - regression_loss: 0.7074 - classification_loss: 0.0675 275/500 [===============>..............] - ETA: 1:16 - loss: 0.7763 - regression_loss: 0.7088 - classification_loss: 0.0675 276/500 [===============>..............] 
- ETA: 1:15 - loss: 0.7762 - regression_loss: 0.7088 - classification_loss: 0.0674 277/500 [===============>..............] - ETA: 1:15 - loss: 0.7759 - regression_loss: 0.7086 - classification_loss: 0.0673 278/500 [===============>..............] - ETA: 1:15 - loss: 0.7757 - regression_loss: 0.7087 - classification_loss: 0.0671 279/500 [===============>..............] - ETA: 1:14 - loss: 0.7793 - regression_loss: 0.7120 - classification_loss: 0.0673 280/500 [===============>..............] - ETA: 1:14 - loss: 0.7796 - regression_loss: 0.7122 - classification_loss: 0.0674 281/500 [===============>..............] - ETA: 1:14 - loss: 0.7779 - regression_loss: 0.7107 - classification_loss: 0.0672 282/500 [===============>..............] - ETA: 1:13 - loss: 0.7762 - regression_loss: 0.7092 - classification_loss: 0.0670 283/500 [===============>..............] - ETA: 1:13 - loss: 0.7758 - regression_loss: 0.7089 - classification_loss: 0.0670 284/500 [================>.............] - ETA: 1:13 - loss: 0.7786 - regression_loss: 0.7109 - classification_loss: 0.0677 285/500 [================>.............] - ETA: 1:12 - loss: 0.7781 - regression_loss: 0.7105 - classification_loss: 0.0676 286/500 [================>.............] - ETA: 1:12 - loss: 0.7769 - regression_loss: 0.7094 - classification_loss: 0.0676 287/500 [================>.............] - ETA: 1:12 - loss: 0.7760 - regression_loss: 0.7086 - classification_loss: 0.0675 288/500 [================>.............] - ETA: 1:11 - loss: 0.7757 - regression_loss: 0.7083 - classification_loss: 0.0673 289/500 [================>.............] - ETA: 1:11 - loss: 0.7755 - regression_loss: 0.7082 - classification_loss: 0.0673 290/500 [================>.............] - ETA: 1:11 - loss: 0.7749 - regression_loss: 0.7078 - classification_loss: 0.0671 291/500 [================>.............] - ETA: 1:10 - loss: 0.7745 - regression_loss: 0.7075 - classification_loss: 0.0670 292/500 [================>.............] 
- ETA: 1:10 - loss: 0.7778 - regression_loss: 0.7103 - classification_loss: 0.0675 293/500 [================>.............] - ETA: 1:10 - loss: 0.7783 - regression_loss: 0.7106 - classification_loss: 0.0678 294/500 [================>.............] - ETA: 1:09 - loss: 0.7782 - regression_loss: 0.7101 - classification_loss: 0.0681 295/500 [================>.............] - ETA: 1:09 - loss: 0.7792 - regression_loss: 0.7105 - classification_loss: 0.0686 296/500 [================>.............] - ETA: 1:09 - loss: 0.7791 - regression_loss: 0.7106 - classification_loss: 0.0686 297/500 [================>.............] - ETA: 1:08 - loss: 0.7795 - regression_loss: 0.7108 - classification_loss: 0.0687 298/500 [================>.............] - ETA: 1:08 - loss: 0.7792 - regression_loss: 0.7106 - classification_loss: 0.0686 299/500 [================>.............] - ETA: 1:08 - loss: 0.7781 - regression_loss: 0.7097 - classification_loss: 0.0684 300/500 [=================>............] - ETA: 1:07 - loss: 0.7764 - regression_loss: 0.7081 - classification_loss: 0.0682 301/500 [=================>............] - ETA: 1:07 - loss: 0.7752 - regression_loss: 0.7072 - classification_loss: 0.0680 302/500 [=================>............] - ETA: 1:07 - loss: 0.7748 - regression_loss: 0.7069 - classification_loss: 0.0679 303/500 [=================>............] - ETA: 1:06 - loss: 0.7762 - regression_loss: 0.7077 - classification_loss: 0.0685 304/500 [=================>............] - ETA: 1:06 - loss: 0.7770 - regression_loss: 0.7086 - classification_loss: 0.0685 305/500 [=================>............] - ETA: 1:06 - loss: 0.7764 - regression_loss: 0.7080 - classification_loss: 0.0684 306/500 [=================>............] - ETA: 1:05 - loss: 0.7773 - regression_loss: 0.7089 - classification_loss: 0.0684 307/500 [=================>............] - ETA: 1:05 - loss: 0.7776 - regression_loss: 0.7093 - classification_loss: 0.0683 308/500 [=================>............] 
- ETA: 1:05 - loss: 0.7782 - regression_loss: 0.7099 - classification_loss: 0.0683 309/500 [=================>............] - ETA: 1:04 - loss: 0.7791 - regression_loss: 0.7105 - classification_loss: 0.0686 310/500 [=================>............] - ETA: 1:04 - loss: 0.7779 - regression_loss: 0.7095 - classification_loss: 0.0684 311/500 [=================>............] - ETA: 1:03 - loss: 0.7783 - regression_loss: 0.7099 - classification_loss: 0.0684 312/500 [=================>............] - ETA: 1:03 - loss: 0.7783 - regression_loss: 0.7099 - classification_loss: 0.0684 313/500 [=================>............] - ETA: 1:03 - loss: 0.7798 - regression_loss: 0.7112 - classification_loss: 0.0686 314/500 [=================>............] - ETA: 1:02 - loss: 0.7801 - regression_loss: 0.7117 - classification_loss: 0.0684 315/500 [=================>............] - ETA: 1:02 - loss: 0.7814 - regression_loss: 0.7130 - classification_loss: 0.0683 316/500 [=================>............] - ETA: 1:02 - loss: 0.7826 - regression_loss: 0.7141 - classification_loss: 0.0685 317/500 [==================>...........] - ETA: 1:01 - loss: 0.7818 - regression_loss: 0.7132 - classification_loss: 0.0686 318/500 [==================>...........] - ETA: 1:01 - loss: 0.7830 - regression_loss: 0.7145 - classification_loss: 0.0685 319/500 [==================>...........] - ETA: 1:01 - loss: 0.7822 - regression_loss: 0.7138 - classification_loss: 0.0684 320/500 [==================>...........] - ETA: 1:00 - loss: 0.7814 - regression_loss: 0.7132 - classification_loss: 0.0683 321/500 [==================>...........] - ETA: 1:00 - loss: 0.7808 - regression_loss: 0.7126 - classification_loss: 0.0681 322/500 [==================>...........] - ETA: 1:00 - loss: 0.7813 - regression_loss: 0.7130 - classification_loss: 0.0683 323/500 [==================>...........] - ETA: 59s - loss: 0.7798 - regression_loss: 0.7117 - classification_loss: 0.0681  324/500 [==================>...........] 
- ETA: 59s - loss: 0.7794 - regression_loss: 0.7114 - classification_loss: 0.0680 325/500 [==================>...........] - ETA: 59s - loss: 0.7789 - regression_loss: 0.7111 - classification_loss: 0.0679 326/500 [==================>...........] - ETA: 58s - loss: 0.7792 - regression_loss: 0.7113 - classification_loss: 0.0679 327/500 [==================>...........] - ETA: 58s - loss: 0.7804 - regression_loss: 0.7122 - classification_loss: 0.0682 328/500 [==================>...........] - ETA: 58s - loss: 0.7813 - regression_loss: 0.7130 - classification_loss: 0.0683 329/500 [==================>...........] - ETA: 57s - loss: 0.7808 - regression_loss: 0.7126 - classification_loss: 0.0682 330/500 [==================>...........] - ETA: 57s - loss: 0.7813 - regression_loss: 0.7132 - classification_loss: 0.0681 331/500 [==================>...........] - ETA: 57s - loss: 0.7819 - regression_loss: 0.7137 - classification_loss: 0.0681 332/500 [==================>...........] - ETA: 56s - loss: 0.7826 - regression_loss: 0.7142 - classification_loss: 0.0684 333/500 [==================>...........] - ETA: 56s - loss: 0.7821 - regression_loss: 0.7138 - classification_loss: 0.0683 334/500 [===================>..........] - ETA: 56s - loss: 0.7831 - regression_loss: 0.7146 - classification_loss: 0.0685 335/500 [===================>..........] - ETA: 55s - loss: 0.7815 - regression_loss: 0.7131 - classification_loss: 0.0684 336/500 [===================>..........] - ETA: 55s - loss: 0.7816 - regression_loss: 0.7132 - classification_loss: 0.0684 337/500 [===================>..........] - ETA: 55s - loss: 0.7838 - regression_loss: 0.7150 - classification_loss: 0.0688 338/500 [===================>..........] - ETA: 54s - loss: 0.7874 - regression_loss: 0.7179 - classification_loss: 0.0694 339/500 [===================>..........] - ETA: 54s - loss: 0.7872 - regression_loss: 0.7178 - classification_loss: 0.0694 340/500 [===================>..........] 
- ETA: 54s - loss: 0.7868 - regression_loss: 0.7174 - classification_loss: 0.0694 341/500 [===================>..........] - ETA: 53s - loss: 0.7860 - regression_loss: 0.7168 - classification_loss: 0.0693 342/500 [===================>..........] - ETA: 53s - loss: 0.7862 - regression_loss: 0.7169 - classification_loss: 0.0693 343/500 [===================>..........] - ETA: 53s - loss: 0.7867 - regression_loss: 0.7172 - classification_loss: 0.0695 344/500 [===================>..........] - ETA: 52s - loss: 0.7867 - regression_loss: 0.7172 - classification_loss: 0.0695 345/500 [===================>..........] - ETA: 52s - loss: 0.7860 - regression_loss: 0.7166 - classification_loss: 0.0694 346/500 [===================>..........] - ETA: 52s - loss: 0.7869 - regression_loss: 0.7174 - classification_loss: 0.0695 347/500 [===================>..........] - ETA: 51s - loss: 0.7907 - regression_loss: 0.7208 - classification_loss: 0.0699 348/500 [===================>..........] - ETA: 51s - loss: 0.7904 - regression_loss: 0.7205 - classification_loss: 0.0699 349/500 [===================>..........] - ETA: 51s - loss: 0.7897 - regression_loss: 0.7199 - classification_loss: 0.0698 350/500 [====================>.........] - ETA: 50s - loss: 0.7895 - regression_loss: 0.7199 - classification_loss: 0.0697 351/500 [====================>.........] - ETA: 50s - loss: 0.7921 - regression_loss: 0.7216 - classification_loss: 0.0705 352/500 [====================>.........] - ETA: 50s - loss: 0.7914 - regression_loss: 0.7211 - classification_loss: 0.0704 353/500 [====================>.........] - ETA: 49s - loss: 0.7908 - regression_loss: 0.7206 - classification_loss: 0.0703 354/500 [====================>.........] - ETA: 49s - loss: 0.7911 - regression_loss: 0.7209 - classification_loss: 0.0703 355/500 [====================>.........] - ETA: 49s - loss: 0.7906 - regression_loss: 0.7205 - classification_loss: 0.0702 356/500 [====================>.........] 
- ETA: 48s - loss: 0.7902 - regression_loss: 0.7202 - classification_loss: 0.0700 357/500 [====================>.........] - ETA: 48s - loss: 0.7915 - regression_loss: 0.7213 - classification_loss: 0.0702 358/500 [====================>.........] - ETA: 48s - loss: 0.7930 - regression_loss: 0.7226 - classification_loss: 0.0705 359/500 [====================>.........] - ETA: 47s - loss: 0.7950 - regression_loss: 0.7239 - classification_loss: 0.0710 360/500 [====================>.........] - ETA: 47s - loss: 0.7942 - regression_loss: 0.7233 - classification_loss: 0.0709 361/500 [====================>.........] - ETA: 47s - loss: 0.7944 - regression_loss: 0.7235 - classification_loss: 0.0709 362/500 [====================>.........] - ETA: 46s - loss: 0.7935 - regression_loss: 0.7228 - classification_loss: 0.0707 363/500 [====================>.........] - ETA: 46s - loss: 0.7929 - regression_loss: 0.7223 - classification_loss: 0.0706 364/500 [====================>.........] - ETA: 46s - loss: 0.7920 - regression_loss: 0.7215 - classification_loss: 0.0705 365/500 [====================>.........] - ETA: 45s - loss: 0.7923 - regression_loss: 0.7217 - classification_loss: 0.0706 366/500 [====================>.........] - ETA: 45s - loss: 0.7926 - regression_loss: 0.7219 - classification_loss: 0.0706 367/500 [=====================>........] - ETA: 45s - loss: 0.7915 - regression_loss: 0.7210 - classification_loss: 0.0705 368/500 [=====================>........] - ETA: 44s - loss: 0.7909 - regression_loss: 0.7206 - classification_loss: 0.0704 369/500 [=====================>........] - ETA: 44s - loss: 0.7903 - regression_loss: 0.7200 - classification_loss: 0.0703 370/500 [=====================>........] - ETA: 43s - loss: 0.7901 - regression_loss: 0.7199 - classification_loss: 0.0702 371/500 [=====================>........] - ETA: 43s - loss: 0.7908 - regression_loss: 0.7204 - classification_loss: 0.0704 372/500 [=====================>........] 
500/500 [==============================] - 169s 339ms/step - loss: 0.7803 - regression_loss: 0.7077 - classification_loss: 0.0726
326 instances of class plum with average precision: 0.8390
mAP: 0.8390
Epoch 00043: saving model to ./training/snapshots/resnet101_pascal_43.h5
Epoch 44/150
- ETA: 1:40 - loss: 0.7315 - regression_loss: 0.6641 - classification_loss: 0.0674 207/500 [===========>..................] - ETA: 1:39 - loss: 0.7315 - regression_loss: 0.6639 - classification_loss: 0.0676 208/500 [===========>..................] - ETA: 1:39 - loss: 0.7296 - regression_loss: 0.6623 - classification_loss: 0.0673 209/500 [===========>..................] - ETA: 1:39 - loss: 0.7293 - regression_loss: 0.6620 - classification_loss: 0.0673 210/500 [===========>..................] - ETA: 1:38 - loss: 0.7326 - regression_loss: 0.6645 - classification_loss: 0.0682 211/500 [===========>..................] - ETA: 1:38 - loss: 0.7316 - regression_loss: 0.6636 - classification_loss: 0.0680 212/500 [===========>..................] - ETA: 1:38 - loss: 0.7306 - regression_loss: 0.6629 - classification_loss: 0.0678 213/500 [===========>..................] - ETA: 1:37 - loss: 0.7303 - regression_loss: 0.6625 - classification_loss: 0.0677 214/500 [===========>..................] - ETA: 1:37 - loss: 0.7292 - regression_loss: 0.6616 - classification_loss: 0.0676 215/500 [===========>..................] - ETA: 1:37 - loss: 0.7298 - regression_loss: 0.6623 - classification_loss: 0.0675 216/500 [===========>..................] - ETA: 1:36 - loss: 0.7285 - regression_loss: 0.6611 - classification_loss: 0.0674 217/500 [============>.................] - ETA: 1:36 - loss: 0.7286 - regression_loss: 0.6613 - classification_loss: 0.0673 218/500 [============>.................] - ETA: 1:36 - loss: 0.7294 - regression_loss: 0.6620 - classification_loss: 0.0674 219/500 [============>.................] - ETA: 1:35 - loss: 0.7284 - regression_loss: 0.6612 - classification_loss: 0.0672 220/500 [============>.................] - ETA: 1:35 - loss: 0.7314 - regression_loss: 0.6642 - classification_loss: 0.0672 221/500 [============>.................] - ETA: 1:35 - loss: 0.7328 - regression_loss: 0.6654 - classification_loss: 0.0674 222/500 [============>.................] 
- ETA: 1:34 - loss: 0.7327 - regression_loss: 0.6654 - classification_loss: 0.0673 223/500 [============>.................] - ETA: 1:34 - loss: 0.7316 - regression_loss: 0.6646 - classification_loss: 0.0671 224/500 [============>.................] - ETA: 1:34 - loss: 0.7319 - regression_loss: 0.6643 - classification_loss: 0.0676 225/500 [============>.................] - ETA: 1:33 - loss: 0.7328 - regression_loss: 0.6650 - classification_loss: 0.0678 226/500 [============>.................] - ETA: 1:33 - loss: 0.7344 - regression_loss: 0.6662 - classification_loss: 0.0682 227/500 [============>.................] - ETA: 1:33 - loss: 0.7349 - regression_loss: 0.6667 - classification_loss: 0.0683 228/500 [============>.................] - ETA: 1:32 - loss: 0.7346 - regression_loss: 0.6664 - classification_loss: 0.0682 229/500 [============>.................] - ETA: 1:32 - loss: 0.7334 - regression_loss: 0.6654 - classification_loss: 0.0680 230/500 [============>.................] - ETA: 1:32 - loss: 0.7323 - regression_loss: 0.6645 - classification_loss: 0.0678 231/500 [============>.................] - ETA: 1:31 - loss: 0.7317 - regression_loss: 0.6640 - classification_loss: 0.0677 232/500 [============>.................] - ETA: 1:31 - loss: 0.7323 - regression_loss: 0.6645 - classification_loss: 0.0678 233/500 [============>.................] - ETA: 1:31 - loss: 0.7308 - regression_loss: 0.6632 - classification_loss: 0.0676 234/500 [=============>................] - ETA: 1:30 - loss: 0.7313 - regression_loss: 0.6635 - classification_loss: 0.0678 235/500 [=============>................] - ETA: 1:30 - loss: 0.7314 - regression_loss: 0.6636 - classification_loss: 0.0678 236/500 [=============>................] - ETA: 1:30 - loss: 0.7307 - regression_loss: 0.6628 - classification_loss: 0.0679 237/500 [=============>................] - ETA: 1:29 - loss: 0.7302 - regression_loss: 0.6624 - classification_loss: 0.0678 238/500 [=============>................] 
- ETA: 1:29 - loss: 0.7292 - regression_loss: 0.6616 - classification_loss: 0.0676 239/500 [=============>................] - ETA: 1:29 - loss: 0.7306 - regression_loss: 0.6623 - classification_loss: 0.0683 240/500 [=============>................] - ETA: 1:28 - loss: 0.7305 - regression_loss: 0.6623 - classification_loss: 0.0682 241/500 [=============>................] - ETA: 1:28 - loss: 0.7320 - regression_loss: 0.6632 - classification_loss: 0.0688 242/500 [=============>................] - ETA: 1:27 - loss: 0.7320 - regression_loss: 0.6634 - classification_loss: 0.0687 243/500 [=============>................] - ETA: 1:27 - loss: 0.7310 - regression_loss: 0.6625 - classification_loss: 0.0685 244/500 [=============>................] - ETA: 1:27 - loss: 0.7303 - regression_loss: 0.6619 - classification_loss: 0.0683 245/500 [=============>................] - ETA: 1:26 - loss: 0.7302 - regression_loss: 0.6618 - classification_loss: 0.0684 246/500 [=============>................] - ETA: 1:26 - loss: 0.7298 - regression_loss: 0.6614 - classification_loss: 0.0683 247/500 [=============>................] - ETA: 1:26 - loss: 0.7313 - regression_loss: 0.6629 - classification_loss: 0.0684 248/500 [=============>................] - ETA: 1:25 - loss: 0.7310 - regression_loss: 0.6628 - classification_loss: 0.0682 249/500 [=============>................] - ETA: 1:25 - loss: 0.7302 - regression_loss: 0.6621 - classification_loss: 0.0681 250/500 [==============>...............] - ETA: 1:25 - loss: 0.7324 - regression_loss: 0.6638 - classification_loss: 0.0686 251/500 [==============>...............] - ETA: 1:24 - loss: 0.7311 - regression_loss: 0.6627 - classification_loss: 0.0684 252/500 [==============>...............] - ETA: 1:24 - loss: 0.7310 - regression_loss: 0.6627 - classification_loss: 0.0683 253/500 [==============>...............] - ETA: 1:24 - loss: 0.7315 - regression_loss: 0.6632 - classification_loss: 0.0682 254/500 [==============>...............] 
- ETA: 1:23 - loss: 0.7321 - regression_loss: 0.6638 - classification_loss: 0.0683 255/500 [==============>...............] - ETA: 1:23 - loss: 0.7316 - regression_loss: 0.6632 - classification_loss: 0.0684 256/500 [==============>...............] - ETA: 1:23 - loss: 0.7307 - regression_loss: 0.6624 - classification_loss: 0.0683 257/500 [==============>...............] - ETA: 1:22 - loss: 0.7304 - regression_loss: 0.6623 - classification_loss: 0.0681 258/500 [==============>...............] - ETA: 1:22 - loss: 0.7304 - regression_loss: 0.6624 - classification_loss: 0.0680 259/500 [==============>...............] - ETA: 1:22 - loss: 0.7290 - regression_loss: 0.6612 - classification_loss: 0.0678 260/500 [==============>...............] - ETA: 1:21 - loss: 0.7324 - regression_loss: 0.6639 - classification_loss: 0.0685 261/500 [==============>...............] - ETA: 1:21 - loss: 0.7335 - regression_loss: 0.6651 - classification_loss: 0.0684 262/500 [==============>...............] - ETA: 1:21 - loss: 0.7316 - regression_loss: 0.6634 - classification_loss: 0.0682 263/500 [==============>...............] - ETA: 1:20 - loss: 0.7320 - regression_loss: 0.6637 - classification_loss: 0.0683 264/500 [==============>...............] - ETA: 1:20 - loss: 0.7325 - regression_loss: 0.6640 - classification_loss: 0.0685 265/500 [==============>...............] - ETA: 1:20 - loss: 0.7326 - regression_loss: 0.6640 - classification_loss: 0.0686 266/500 [==============>...............] - ETA: 1:19 - loss: 0.7372 - regression_loss: 0.6680 - classification_loss: 0.0692 267/500 [===============>..............] - ETA: 1:19 - loss: 0.7360 - regression_loss: 0.6670 - classification_loss: 0.0690 268/500 [===============>..............] - ETA: 1:19 - loss: 0.7341 - regression_loss: 0.6653 - classification_loss: 0.0688 269/500 [===============>..............] - ETA: 1:18 - loss: 0.7342 - regression_loss: 0.6654 - classification_loss: 0.0687 270/500 [===============>..............] 
- ETA: 1:18 - loss: 0.7336 - regression_loss: 0.6650 - classification_loss: 0.0687 271/500 [===============>..............] - ETA: 1:18 - loss: 0.7339 - regression_loss: 0.6653 - classification_loss: 0.0686 272/500 [===============>..............] - ETA: 1:17 - loss: 0.7337 - regression_loss: 0.6651 - classification_loss: 0.0685 273/500 [===============>..............] - ETA: 1:17 - loss: 0.7353 - regression_loss: 0.6666 - classification_loss: 0.0687 274/500 [===============>..............] - ETA: 1:17 - loss: 0.7378 - regression_loss: 0.6683 - classification_loss: 0.0694 275/500 [===============>..............] - ETA: 1:16 - loss: 0.7369 - regression_loss: 0.6676 - classification_loss: 0.0692 276/500 [===============>..............] - ETA: 1:16 - loss: 0.7362 - regression_loss: 0.6672 - classification_loss: 0.0691 277/500 [===============>..............] - ETA: 1:15 - loss: 0.7353 - regression_loss: 0.6664 - classification_loss: 0.0689 278/500 [===============>..............] - ETA: 1:15 - loss: 0.7366 - regression_loss: 0.6675 - classification_loss: 0.0691 279/500 [===============>..............] - ETA: 1:15 - loss: 0.7373 - regression_loss: 0.6682 - classification_loss: 0.0691 280/500 [===============>..............] - ETA: 1:14 - loss: 0.7370 - regression_loss: 0.6680 - classification_loss: 0.0690 281/500 [===============>..............] - ETA: 1:14 - loss: 0.7373 - regression_loss: 0.6682 - classification_loss: 0.0692 282/500 [===============>..............] - ETA: 1:14 - loss: 0.7373 - regression_loss: 0.6682 - classification_loss: 0.0691 283/500 [===============>..............] - ETA: 1:13 - loss: 0.7361 - regression_loss: 0.6671 - classification_loss: 0.0690 284/500 [================>.............] - ETA: 1:13 - loss: 0.7353 - regression_loss: 0.6664 - classification_loss: 0.0688 285/500 [================>.............] - ETA: 1:13 - loss: 0.7350 - regression_loss: 0.6663 - classification_loss: 0.0688 286/500 [================>.............] 
- ETA: 1:12 - loss: 0.7343 - regression_loss: 0.6657 - classification_loss: 0.0687 287/500 [================>.............] - ETA: 1:12 - loss: 0.7332 - regression_loss: 0.6647 - classification_loss: 0.0685 288/500 [================>.............] - ETA: 1:12 - loss: 0.7347 - regression_loss: 0.6659 - classification_loss: 0.0689 289/500 [================>.............] - ETA: 1:11 - loss: 0.7379 - regression_loss: 0.6688 - classification_loss: 0.0691 290/500 [================>.............] - ETA: 1:11 - loss: 0.7378 - regression_loss: 0.6688 - classification_loss: 0.0690 291/500 [================>.............] - ETA: 1:11 - loss: 0.7394 - regression_loss: 0.6703 - classification_loss: 0.0691 292/500 [================>.............] - ETA: 1:10 - loss: 0.7413 - regression_loss: 0.6717 - classification_loss: 0.0697 293/500 [================>.............] - ETA: 1:10 - loss: 0.7415 - regression_loss: 0.6718 - classification_loss: 0.0697 294/500 [================>.............] - ETA: 1:10 - loss: 0.7409 - regression_loss: 0.6714 - classification_loss: 0.0695 295/500 [================>.............] - ETA: 1:09 - loss: 0.7414 - regression_loss: 0.6716 - classification_loss: 0.0698 296/500 [================>.............] - ETA: 1:09 - loss: 0.7413 - regression_loss: 0.6716 - classification_loss: 0.0698 297/500 [================>.............] - ETA: 1:09 - loss: 0.7414 - regression_loss: 0.6716 - classification_loss: 0.0698 298/500 [================>.............] - ETA: 1:08 - loss: 0.7421 - regression_loss: 0.6723 - classification_loss: 0.0699 299/500 [================>.............] - ETA: 1:08 - loss: 0.7440 - regression_loss: 0.6737 - classification_loss: 0.0703 300/500 [=================>............] - ETA: 1:08 - loss: 0.7452 - regression_loss: 0.6748 - classification_loss: 0.0704 301/500 [=================>............] - ETA: 1:07 - loss: 0.7448 - regression_loss: 0.6745 - classification_loss: 0.0703 302/500 [=================>............] 
- ETA: 1:07 - loss: 0.7443 - regression_loss: 0.6741 - classification_loss: 0.0702 303/500 [=================>............] - ETA: 1:07 - loss: 0.7441 - regression_loss: 0.6741 - classification_loss: 0.0700 304/500 [=================>............] - ETA: 1:06 - loss: 0.7445 - regression_loss: 0.6745 - classification_loss: 0.0700 305/500 [=================>............] - ETA: 1:06 - loss: 0.7447 - regression_loss: 0.6747 - classification_loss: 0.0700 306/500 [=================>............] - ETA: 1:06 - loss: 0.7451 - regression_loss: 0.6750 - classification_loss: 0.0701 307/500 [=================>............] - ETA: 1:05 - loss: 0.7444 - regression_loss: 0.6744 - classification_loss: 0.0700 308/500 [=================>............] - ETA: 1:05 - loss: 0.7434 - regression_loss: 0.6736 - classification_loss: 0.0698 309/500 [=================>............] - ETA: 1:05 - loss: 0.7426 - regression_loss: 0.6729 - classification_loss: 0.0697 310/500 [=================>............] - ETA: 1:04 - loss: 0.7419 - regression_loss: 0.6724 - classification_loss: 0.0695 311/500 [=================>............] - ETA: 1:04 - loss: 0.7411 - regression_loss: 0.6717 - classification_loss: 0.0694 312/500 [=================>............] - ETA: 1:04 - loss: 0.7406 - regression_loss: 0.6713 - classification_loss: 0.0693 313/500 [=================>............] - ETA: 1:03 - loss: 0.7405 - regression_loss: 0.6713 - classification_loss: 0.0692 314/500 [=================>............] - ETA: 1:03 - loss: 0.7407 - regression_loss: 0.6715 - classification_loss: 0.0692 315/500 [=================>............] - ETA: 1:03 - loss: 0.7411 - regression_loss: 0.6719 - classification_loss: 0.0692 316/500 [=================>............] - ETA: 1:02 - loss: 0.7398 - regression_loss: 0.6707 - classification_loss: 0.0691 317/500 [==================>...........] - ETA: 1:02 - loss: 0.7390 - regression_loss: 0.6701 - classification_loss: 0.0689 318/500 [==================>...........] 
- ETA: 1:01 - loss: 0.7401 - regression_loss: 0.6710 - classification_loss: 0.0691 319/500 [==================>...........] - ETA: 1:01 - loss: 0.7393 - regression_loss: 0.6704 - classification_loss: 0.0690 320/500 [==================>...........] - ETA: 1:01 - loss: 0.7383 - regression_loss: 0.6694 - classification_loss: 0.0689 321/500 [==================>...........] - ETA: 1:00 - loss: 0.7385 - regression_loss: 0.6697 - classification_loss: 0.0688 322/500 [==================>...........] - ETA: 1:00 - loss: 0.7378 - regression_loss: 0.6691 - classification_loss: 0.0687 323/500 [==================>...........] - ETA: 1:00 - loss: 0.7377 - regression_loss: 0.6690 - classification_loss: 0.0687 324/500 [==================>...........] - ETA: 59s - loss: 0.7398 - regression_loss: 0.6709 - classification_loss: 0.0689  325/500 [==================>...........] - ETA: 59s - loss: 0.7400 - regression_loss: 0.6712 - classification_loss: 0.0688 326/500 [==================>...........] - ETA: 59s - loss: 0.7400 - regression_loss: 0.6712 - classification_loss: 0.0688 327/500 [==================>...........] - ETA: 58s - loss: 0.7396 - regression_loss: 0.6709 - classification_loss: 0.0687 328/500 [==================>...........] - ETA: 58s - loss: 0.7390 - regression_loss: 0.6704 - classification_loss: 0.0686 329/500 [==================>...........] - ETA: 58s - loss: 0.7381 - regression_loss: 0.6697 - classification_loss: 0.0685 330/500 [==================>...........] - ETA: 57s - loss: 0.7371 - regression_loss: 0.6688 - classification_loss: 0.0683 331/500 [==================>...........] - ETA: 57s - loss: 0.7368 - regression_loss: 0.6686 - classification_loss: 0.0682 332/500 [==================>...........] - ETA: 57s - loss: 0.7382 - regression_loss: 0.6699 - classification_loss: 0.0683 333/500 [==================>...........] - ETA: 56s - loss: 0.7392 - regression_loss: 0.6709 - classification_loss: 0.0684 334/500 [===================>..........] 
- ETA: 56s - loss: 0.7377 - regression_loss: 0.6694 - classification_loss: 0.0682 335/500 [===================>..........] - ETA: 56s - loss: 0.7466 - regression_loss: 0.6745 - classification_loss: 0.0720 336/500 [===================>..........] - ETA: 55s - loss: 0.7452 - regression_loss: 0.6733 - classification_loss: 0.0719 337/500 [===================>..........] - ETA: 55s - loss: 0.7450 - regression_loss: 0.6733 - classification_loss: 0.0717 338/500 [===================>..........] - ETA: 55s - loss: 0.7447 - regression_loss: 0.6731 - classification_loss: 0.0716 339/500 [===================>..........] - ETA: 54s - loss: 0.7453 - regression_loss: 0.6737 - classification_loss: 0.0716 340/500 [===================>..........] - ETA: 54s - loss: 0.7447 - regression_loss: 0.6732 - classification_loss: 0.0715 341/500 [===================>..........] - ETA: 54s - loss: 0.7446 - regression_loss: 0.6732 - classification_loss: 0.0714 342/500 [===================>..........] - ETA: 53s - loss: 0.7450 - regression_loss: 0.6736 - classification_loss: 0.0714 343/500 [===================>..........] - ETA: 53s - loss: 0.7453 - regression_loss: 0.6739 - classification_loss: 0.0714 344/500 [===================>..........] - ETA: 53s - loss: 0.7445 - regression_loss: 0.6726 - classification_loss: 0.0718 345/500 [===================>..........] - ETA: 52s - loss: 0.7446 - regression_loss: 0.6727 - classification_loss: 0.0719 346/500 [===================>..........] - ETA: 52s - loss: 0.7446 - regression_loss: 0.6728 - classification_loss: 0.0719 347/500 [===================>..........] - ETA: 52s - loss: 0.7441 - regression_loss: 0.6723 - classification_loss: 0.0718 348/500 [===================>..........] - ETA: 51s - loss: 0.7434 - regression_loss: 0.6717 - classification_loss: 0.0717 349/500 [===================>..........] - ETA: 51s - loss: 0.7428 - regression_loss: 0.6710 - classification_loss: 0.0718 350/500 [====================>.........] 
- ETA: 51s - loss: 0.7423 - regression_loss: 0.6706 - classification_loss: 0.0716 351/500 [====================>.........] - ETA: 50s - loss: 0.7408 - regression_loss: 0.6693 - classification_loss: 0.0715 352/500 [====================>.........] - ETA: 50s - loss: 0.7406 - regression_loss: 0.6692 - classification_loss: 0.0714 353/500 [====================>.........] - ETA: 50s - loss: 0.7398 - regression_loss: 0.6684 - classification_loss: 0.0714 354/500 [====================>.........] - ETA: 49s - loss: 0.7384 - regression_loss: 0.6672 - classification_loss: 0.0712 355/500 [====================>.........] - ETA: 49s - loss: 0.7384 - regression_loss: 0.6673 - classification_loss: 0.0712 356/500 [====================>.........] - ETA: 49s - loss: 0.7395 - regression_loss: 0.6681 - classification_loss: 0.0714 357/500 [====================>.........] - ETA: 48s - loss: 0.7398 - regression_loss: 0.6684 - classification_loss: 0.0713 358/500 [====================>.........] - ETA: 48s - loss: 0.7398 - regression_loss: 0.6685 - classification_loss: 0.0713 359/500 [====================>.........] - ETA: 48s - loss: 0.7395 - regression_loss: 0.6683 - classification_loss: 0.0712 360/500 [====================>.........] - ETA: 47s - loss: 0.7398 - regression_loss: 0.6686 - classification_loss: 0.0712 361/500 [====================>.........] - ETA: 47s - loss: 0.7404 - regression_loss: 0.6693 - classification_loss: 0.0711 362/500 [====================>.........] - ETA: 46s - loss: 0.7423 - regression_loss: 0.6711 - classification_loss: 0.0712 363/500 [====================>.........] - ETA: 46s - loss: 0.7438 - regression_loss: 0.6721 - classification_loss: 0.0716 364/500 [====================>.........] - ETA: 46s - loss: 0.7451 - regression_loss: 0.6733 - classification_loss: 0.0717 365/500 [====================>.........] - ETA: 45s - loss: 0.7449 - regression_loss: 0.6733 - classification_loss: 0.0716 366/500 [====================>.........] 
- ETA: 45s - loss: 0.7460 - regression_loss: 0.6742 - classification_loss: 0.0718 367/500 [=====================>........] - ETA: 45s - loss: 0.7454 - regression_loss: 0.6737 - classification_loss: 0.0717 368/500 [=====================>........] - ETA: 44s - loss: 0.7452 - regression_loss: 0.6737 - classification_loss: 0.0715 369/500 [=====================>........] - ETA: 44s - loss: 0.7461 - regression_loss: 0.6746 - classification_loss: 0.0715 370/500 [=====================>........] - ETA: 44s - loss: 0.7461 - regression_loss: 0.6747 - classification_loss: 0.0714 371/500 [=====================>........] - ETA: 43s - loss: 0.7454 - regression_loss: 0.6741 - classification_loss: 0.0713 372/500 [=====================>........] - ETA: 43s - loss: 0.7443 - regression_loss: 0.6731 - classification_loss: 0.0712 373/500 [=====================>........] - ETA: 43s - loss: 0.7445 - regression_loss: 0.6733 - classification_loss: 0.0712 374/500 [=====================>........] - ETA: 42s - loss: 0.7452 - regression_loss: 0.6739 - classification_loss: 0.0712 375/500 [=====================>........] - ETA: 42s - loss: 0.7448 - regression_loss: 0.6737 - classification_loss: 0.0711 376/500 [=====================>........] - ETA: 42s - loss: 0.7454 - regression_loss: 0.6743 - classification_loss: 0.0711 377/500 [=====================>........] - ETA: 41s - loss: 0.7456 - regression_loss: 0.6743 - classification_loss: 0.0713 378/500 [=====================>........] - ETA: 41s - loss: 0.7456 - regression_loss: 0.6743 - classification_loss: 0.0713 379/500 [=====================>........] - ETA: 41s - loss: 0.7463 - regression_loss: 0.6746 - classification_loss: 0.0717 380/500 [=====================>........] - ETA: 40s - loss: 0.7479 - regression_loss: 0.6760 - classification_loss: 0.0719 381/500 [=====================>........] - ETA: 40s - loss: 0.7474 - regression_loss: 0.6756 - classification_loss: 0.0718 382/500 [=====================>........] 
- ETA: 40s - loss: 0.7470 - regression_loss: 0.6752 - classification_loss: 0.0718 383/500 [=====================>........] - ETA: 39s - loss: 0.7471 - regression_loss: 0.6753 - classification_loss: 0.0718 384/500 [======================>.......] - ETA: 39s - loss: 0.7480 - regression_loss: 0.6761 - classification_loss: 0.0719 385/500 [======================>.......] - ETA: 39s - loss: 0.7471 - regression_loss: 0.6753 - classification_loss: 0.0718 386/500 [======================>.......] - ETA: 38s - loss: 0.7482 - regression_loss: 0.6761 - classification_loss: 0.0720 387/500 [======================>.......] - ETA: 38s - loss: 0.7478 - regression_loss: 0.6758 - classification_loss: 0.0719 388/500 [======================>.......] - ETA: 38s - loss: 0.7479 - regression_loss: 0.6757 - classification_loss: 0.0722 389/500 [======================>.......] - ETA: 37s - loss: 0.7475 - regression_loss: 0.6753 - classification_loss: 0.0721 390/500 [======================>.......] - ETA: 37s - loss: 0.7490 - regression_loss: 0.6764 - classification_loss: 0.0726 391/500 [======================>.......] - ETA: 37s - loss: 0.7487 - regression_loss: 0.6762 - classification_loss: 0.0725 392/500 [======================>.......] - ETA: 36s - loss: 0.7472 - regression_loss: 0.6749 - classification_loss: 0.0723 393/500 [======================>.......] - ETA: 36s - loss: 0.7472 - regression_loss: 0.6748 - classification_loss: 0.0724 394/500 [======================>.......] - ETA: 36s - loss: 0.7473 - regression_loss: 0.6750 - classification_loss: 0.0724 395/500 [======================>.......] - ETA: 35s - loss: 0.7468 - regression_loss: 0.6745 - classification_loss: 0.0723 396/500 [======================>.......] - ETA: 35s - loss: 0.7465 - regression_loss: 0.6743 - classification_loss: 0.0722 397/500 [======================>.......] - ETA: 35s - loss: 0.7470 - regression_loss: 0.6748 - classification_loss: 0.0722 398/500 [======================>.......] 
- ETA: 34s - loss: 0.7465 - regression_loss: 0.6745 - classification_loss: 0.0721 399/500 [======================>.......] - ETA: 34s - loss: 0.7470 - regression_loss: 0.6749 - classification_loss: 0.0722 400/500 [=======================>......] - ETA: 34s - loss: 0.7471 - regression_loss: 0.6750 - classification_loss: 0.0721 401/500 [=======================>......] - ETA: 33s - loss: 0.7500 - regression_loss: 0.6775 - classification_loss: 0.0725 402/500 [=======================>......] - ETA: 33s - loss: 0.7496 - regression_loss: 0.6771 - classification_loss: 0.0724 403/500 [=======================>......] - ETA: 32s - loss: 0.7499 - regression_loss: 0.6774 - classification_loss: 0.0725 404/500 [=======================>......] - ETA: 32s - loss: 0.7507 - regression_loss: 0.6780 - classification_loss: 0.0727 405/500 [=======================>......] - ETA: 32s - loss: 0.7510 - regression_loss: 0.6783 - classification_loss: 0.0727 406/500 [=======================>......] - ETA: 31s - loss: 0.7519 - regression_loss: 0.6790 - classification_loss: 0.0728 407/500 [=======================>......] - ETA: 31s - loss: 0.7506 - regression_loss: 0.6779 - classification_loss: 0.0727 408/500 [=======================>......] - ETA: 31s - loss: 0.7502 - regression_loss: 0.6776 - classification_loss: 0.0725 409/500 [=======================>......] - ETA: 30s - loss: 0.7497 - regression_loss: 0.6773 - classification_loss: 0.0724 410/500 [=======================>......] - ETA: 30s - loss: 0.7498 - regression_loss: 0.6774 - classification_loss: 0.0724 411/500 [=======================>......] - ETA: 30s - loss: 0.7496 - regression_loss: 0.6772 - classification_loss: 0.0724 412/500 [=======================>......] - ETA: 29s - loss: 0.7515 - regression_loss: 0.6787 - classification_loss: 0.0728 413/500 [=======================>......] - ETA: 29s - loss: 0.7510 - regression_loss: 0.6783 - classification_loss: 0.0727 414/500 [=======================>......] 
500/500 [==============================] - 170s 340ms/step - loss: 0.7463 - regression_loss: 0.6746 - classification_loss: 0.0717
326 instances of class plum with average precision: 0.8417
mAP: 0.8417
Epoch 00044: saving model to ./training/snapshots/resnet101_pascal_44.h5