CHECK: Is CUDA the right version (10)?
Creating network using backbone resnet101
Not applying augmentation to RGBD data
Loading a pascal format RGBD dataset
Loading imagenet weights
Creating model, this may take a second...
Building ResNet backbone using defined input shape of Tensor("input_1:0", shape=(?, ?, ?, 4), dtype=float32)
Loading weights into model
tracking anchors
tracking anchors
tracking anchors
tracking anchors
tracking anchors
Model: "retinanet"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, None, None, 4 0
__________________________________________________________________________________________________
padding_conv1 (ZeroPadding2D)   (None, None, None, 4 0           input_1[0][0]
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, None, None, 6 12544       padding_conv1[0][0]
__________________________________________________________________________________________________
bn_conv1 (BatchNormalization)   (None, None, None, 6 256         conv1[0][0]
__________________________________________________________________________________________________
conv1_relu (Activation)         (None, None, None, 6 0           bn_conv1[0][0]
__________________________________________________________________________________________________
pool1 (MaxPooling2D)            (None, None, None, 6 0           conv1_relu[0][0]
__________________________________________________________________________________________________
res2a_branch2a (Conv2D)         (None, None, None, 6 4096        pool1[0][0]
__________________________________________________________________________________________________
bn2a_branch2a (BatchNormalizati (None, None, None, 6 256         res2a_branch2a[0][0]
__________________________________________________________________________________________________
res2a_branch2a_relu (Activation (None, None, None, 6 0           bn2a_branch2a[0][0]
__________________________________________________________________________________________________
padding2a_branch2b (ZeroPadding (None, None, None, 6 0           res2a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2a_branch2b (Conv2D)         (None, None, None, 6 36864       padding2a_branch2b[0][0]
__________________________________________________________________________________________________
bn2a_branch2b (BatchNormalizati (None, None, None, 6 256         res2a_branch2b[0][0]
__________________________________________________________________________________________________
res2a_branch2b_relu (Activation (None, None, None, 6 0           bn2a_branch2b[0][0]
__________________________________________________________________________________________________
res2a_branch2c (Conv2D)         (None, None, None, 2 16384       res2a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res2a_branch1 (Conv2D)          (None, None, None, 2 16384       pool1[0][0]
__________________________________________________________________________________________________
bn2a_branch2c (BatchNormalizati (None, None, None, 2 1024        res2a_branch2c[0][0]
__________________________________________________________________________________________________
bn2a_branch1 (BatchNormalizatio (None, None, None, 2 1024        res2a_branch1[0][0]
__________________________________________________________________________________________________
res2a (Add)                     (None, None, None, 2 0           bn2a_branch2c[0][0]
                                                                 bn2a_branch1[0][0]
__________________________________________________________________________________________________
res2a_relu (Activation)         (None, None, None, 2 0           res2a[0][0]
__________________________________________________________________________________________________
res2b_branch2a (Conv2D)         (None, None, None, 6 16384       res2a_relu[0][0]
__________________________________________________________________________________________________
bn2b_branch2a (BatchNormalizati (None, None, None, 6 256         res2b_branch2a[0][0]
__________________________________________________________________________________________________
res2b_branch2a_relu (Activation (None, None, None, 6 0           bn2b_branch2a[0][0]
__________________________________________________________________________________________________
padding2b_branch2b (ZeroPadding (None, None, None, 6 0           res2b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2b_branch2b (Conv2D)         (None, None, None, 6 36864       padding2b_branch2b[0][0]
__________________________________________________________________________________________________
bn2b_branch2b (BatchNormalizati (None, None, None, 6 256         res2b_branch2b[0][0]
__________________________________________________________________________________________________
res2b_branch2b_relu (Activation (None, None, None, 6 0           bn2b_branch2b[0][0]
__________________________________________________________________________________________________
res2b_branch2c (Conv2D)         (None, None, None, 2 16384       res2b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn2b_branch2c (BatchNormalizati (None, None, None, 2 1024        res2b_branch2c[0][0]
__________________________________________________________________________________________________
res2b (Add)                     (None, None, None, 2 0           bn2b_branch2c[0][0]
                                                                 res2a_relu[0][0]
__________________________________________________________________________________________________
res2b_relu (Activation)         (None, None, None, 2 0           res2b[0][0]
__________________________________________________________________________________________________
res2c_branch2a (Conv2D)         (None, None, None, 6 16384       res2b_relu[0][0]
__________________________________________________________________________________________________
bn2c_branch2a (BatchNormalizati (None, None, None, 6 256         res2c_branch2a[0][0]
__________________________________________________________________________________________________
res2c_branch2a_relu (Activation (None, None, None, 6 0           bn2c_branch2a[0][0]
__________________________________________________________________________________________________
padding2c_branch2b (ZeroPadding (None, None, None, 6 0           res2c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2c_branch2b (Conv2D)         (None, None, None, 6 36864       padding2c_branch2b[0][0]
__________________________________________________________________________________________________
bn2c_branch2b (BatchNormalizati (None, None, None, 6 256         res2c_branch2b[0][0]
__________________________________________________________________________________________________
res2c_branch2b_relu (Activation (None, None, None, 6 0           bn2c_branch2b[0][0]
__________________________________________________________________________________________________
res2c_branch2c (Conv2D)         (None, None, None, 2 16384       res2c_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn2c_branch2c (BatchNormalizati (None, None, None, 2 1024        res2c_branch2c[0][0]
__________________________________________________________________________________________________
res2c (Add)                     (None, None, None, 2 0           bn2c_branch2c[0][0]
                                                                 res2b_relu[0][0]
__________________________________________________________________________________________________
res2c_relu (Activation)         (None, None, None, 2 0           res2c[0][0]
__________________________________________________________________________________________________
res3a_branch2a (Conv2D)         (None, None, None, 1 32768       res2c_relu[0][0]
__________________________________________________________________________________________________
bn3a_branch2a (BatchNormalizati (None, None, None, 1 512         res3a_branch2a[0][0]
__________________________________________________________________________________________________
res3a_branch2a_relu (Activation (None, None, None, 1 0           bn3a_branch2a[0][0]
__________________________________________________________________________________________________
padding3a_branch2b (ZeroPadding (None, None, None, 1 0           res3a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3a_branch2b (Conv2D)         (None, None, None, 1 147456      padding3a_branch2b[0][0]
__________________________________________________________________________________________________
bn3a_branch2b (BatchNormalizati (None, None, None, 1 512         res3a_branch2b[0][0]
__________________________________________________________________________________________________
res3a_branch2b_relu (Activation (None, None, None, 1 0           bn3a_branch2b[0][0]
__________________________________________________________________________________________________
res3a_branch2c (Conv2D)         (None, None, None, 5 65536       res3a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res3a_branch1 (Conv2D)          (None, None, None, 5 131072      res2c_relu[0][0]
__________________________________________________________________________________________________
bn3a_branch2c (BatchNormalizati (None, None, None, 5 2048        res3a_branch2c[0][0]
__________________________________________________________________________________________________
bn3a_branch1 (BatchNormalizatio (None, None, None, 5 2048        res3a_branch1[0][0]
__________________________________________________________________________________________________
res3a (Add)                     (None, None, None, 5 0           bn3a_branch2c[0][0]
                                                                 bn3a_branch1[0][0]
__________________________________________________________________________________________________
res3a_relu (Activation)         (None, None, None, 5 0           res3a[0][0]
__________________________________________________________________________________________________
res3b1_branch2a (Conv2D)        (None, None, None, 1 65536       res3a_relu[0][0]
__________________________________________________________________________________________________
bn3b1_branch2a (BatchNormalizat (None, None, None, 1 512         res3b1_branch2a[0][0]
__________________________________________________________________________________________________
res3b1_branch2a_relu (Activatio (None, None, None, 1 0           bn3b1_branch2a[0][0]
__________________________________________________________________________________________________
padding3b1_branch2b (ZeroPaddin (None, None, None, 1 0           res3b1_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3b1_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b1_branch2b[0][0]
__________________________________________________________________________________________________
bn3b1_branch2b (BatchNormalizat (None, None, None, 1 512         res3b1_branch2b[0][0]
__________________________________________________________________________________________________
res3b1_branch2b_relu (Activatio (None, None, None, 1 0           bn3b1_branch2b[0][0]
__________________________________________________________________________________________________
res3b1_branch2c (Conv2D)        (None, None, None, 5 65536       res3b1_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3b1_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b1_branch2c[0][0]
__________________________________________________________________________________________________
res3b1 (Add)                    (None, None, None, 5 0           bn3b1_branch2c[0][0]
                                                                 res3a_relu[0][0]
__________________________________________________________________________________________________
res3b1_relu (Activation)        (None, None, None, 5 0           res3b1[0][0]
__________________________________________________________________________________________________
res3b2_branch2a (Conv2D)        (None, None, None, 1 65536       res3b1_relu[0][0]
__________________________________________________________________________________________________
bn3b2_branch2a (BatchNormalizat (None, None, None, 1 512         res3b2_branch2a[0][0]
__________________________________________________________________________________________________
res3b2_branch2a_relu (Activatio (None, None, None, 1 0           bn3b2_branch2a[0][0]
__________________________________________________________________________________________________
padding3b2_branch2b (ZeroPaddin (None, None, None, 1 0           res3b2_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3b2_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b2_branch2b[0][0]
__________________________________________________________________________________________________
bn3b2_branch2b (BatchNormalizat (None, None, None, 1 512         res3b2_branch2b[0][0]
__________________________________________________________________________________________________
res3b2_branch2b_relu (Activatio (None, None, None, 1 0           bn3b2_branch2b[0][0]
__________________________________________________________________________________________________
res3b2_branch2c (Conv2D)        (None, None, None, 5 65536       res3b2_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3b2_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b2_branch2c[0][0]
__________________________________________________________________________________________________
res3b2 (Add)                    (None, None, None, 5 0           bn3b2_branch2c[0][0]
                                                                 res3b1_relu[0][0]
__________________________________________________________________________________________________
res3b2_relu (Activation)        (None, None, None, 5 0           res3b2[0][0]
__________________________________________________________________________________________________
res3b3_branch2a (Conv2D)        (None, None, None, 1 65536       res3b2_relu[0][0]
__________________________________________________________________________________________________
bn3b3_branch2a (BatchNormalizat (None, None, None, 1 512         res3b3_branch2a[0][0]
__________________________________________________________________________________________________
res3b3_branch2a_relu (Activatio (None, None, None, 1 0           bn3b3_branch2a[0][0]
__________________________________________________________________________________________________
padding3b3_branch2b (ZeroPaddin (None, None, None, 1 0           res3b3_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3b3_branch2b (Conv2D)        (None, None, None, 1 147456      padding3b3_branch2b[0][0]
__________________________________________________________________________________________________
bn3b3_branch2b (BatchNormalizat (None, None, None, 1 512         res3b3_branch2b[0][0]
__________________________________________________________________________________________________
res3b3_branch2b_relu (Activatio (None, None, None, 1 0           bn3b3_branch2b[0][0]
__________________________________________________________________________________________________
res3b3_branch2c (Conv2D)        (None, None, None, 5 65536       res3b3_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3b3_branch2c (BatchNormalizat (None, None, None, 5 2048        res3b3_branch2c[0][0]
__________________________________________________________________________________________________
res3b3 (Add)                    (None, None, None, 5 0           bn3b3_branch2c[0][0]
                                                                 res3b2_relu[0][0]
__________________________________________________________________________________________________
res3b3_relu (Activation)        (None, None, None, 5 0           res3b3[0][0]
__________________________________________________________________________________________________
res4a_branch2a (Conv2D)         (None, None, None, 2 131072      res3b3_relu[0][0]
__________________________________________________________________________________________________
bn4a_branch2a (BatchNormalizati (None, None, None, 2 1024        res4a_branch2a[0][0]
__________________________________________________________________________________________________
res4a_branch2a_relu (Activation (None, None, None, 2 0           bn4a_branch2a[0][0]
__________________________________________________________________________________________________
padding4a_branch2b (ZeroPadding (None, None, None, 2 0           res4a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4a_branch2b (Conv2D)         (None, None, None, 2 589824      padding4a_branch2b[0][0]
__________________________________________________________________________________________________
bn4a_branch2b (BatchNormalizati (None, None, None, 2 1024        res4a_branch2b[0][0]
__________________________________________________________________________________________________
res4a_branch2b_relu (Activation (None, None, None, 2 0           bn4a_branch2b[0][0]
__________________________________________________________________________________________________
res4a_branch2c (Conv2D)         (None, None, None, 1 262144      res4a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res4a_branch1 (Conv2D)          (None, None, None, 1 524288      res3b3_relu[0][0]
__________________________________________________________________________________________________
bn4a_branch2c (BatchNormalizati (None, None, None, 1 4096        res4a_branch2c[0][0]
__________________________________________________________________________________________________
bn4a_branch1 (BatchNormalizatio (None, None, None, 1 4096        res4a_branch1[0][0]
__________________________________________________________________________________________________
res4a (Add)                     (None, None, None, 1 0           bn4a_branch2c[0][0]
                                                                 bn4a_branch1[0][0]
__________________________________________________________________________________________________
res4a_relu (Activation)         (None, None, None, 1 0           res4a[0][0]
__________________________________________________________________________________________________
res4b1_branch2a (Conv2D)        (None, None, None, 2 262144      res4a_relu[0][0]
__________________________________________________________________________________________________
bn4b1_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b1_branch2a[0][0]
__________________________________________________________________________________________________
res4b1_branch2a_relu (Activatio (None, None, None, 2 0           bn4b1_branch2a[0][0]
__________________________________________________________________________________________________
padding4b1_branch2b (ZeroPaddin (None, None, None, 2 0           res4b1_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b1_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b1_branch2b[0][0]
__________________________________________________________________________________________________
bn4b1_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b1_branch2b[0][0]
__________________________________________________________________________________________________
res4b1_branch2b_relu (Activatio (None, None, None, 2 0           bn4b1_branch2b[0][0]
__________________________________________________________________________________________________
res4b1_branch2c (Conv2D)        (None, None, None, 1 262144      res4b1_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b1_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b1_branch2c[0][0]
__________________________________________________________________________________________________
res4b1 (Add)                    (None, None, None, 1 0           bn4b1_branch2c[0][0]
                                                                 res4a_relu[0][0]
__________________________________________________________________________________________________
res4b1_relu (Activation)        (None, None, None, 1 0           res4b1[0][0]
__________________________________________________________________________________________________
res4b2_branch2a (Conv2D)        (None, None, None, 2 262144      res4b1_relu[0][0]
__________________________________________________________________________________________________
bn4b2_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b2_branch2a[0][0]
__________________________________________________________________________________________________
res4b2_branch2a_relu (Activatio (None, None, None, 2 0           bn4b2_branch2a[0][0]
__________________________________________________________________________________________________
padding4b2_branch2b (ZeroPaddin (None, None, None, 2 0           res4b2_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b2_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b2_branch2b[0][0]
__________________________________________________________________________________________________
bn4b2_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b2_branch2b[0][0]
__________________________________________________________________________________________________
res4b2_branch2b_relu (Activatio (None, None, None, 2 0           bn4b2_branch2b[0][0]
__________________________________________________________________________________________________
res4b2_branch2c (Conv2D)        (None, None, None, 1 262144      res4b2_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b2_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b2_branch2c[0][0]
__________________________________________________________________________________________________
res4b2 (Add)                    (None, None, None, 1 0           bn4b2_branch2c[0][0]
                                                                 res4b1_relu[0][0]
__________________________________________________________________________________________________
res4b2_relu (Activation)        (None, None, None, 1 0           res4b2[0][0]
__________________________________________________________________________________________________
res4b3_branch2a (Conv2D)        (None, None, None, 2 262144      res4b2_relu[0][0]
__________________________________________________________________________________________________
bn4b3_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b3_branch2a[0][0]
__________________________________________________________________________________________________
res4b3_branch2a_relu (Activatio (None, None, None, 2 0           bn4b3_branch2a[0][0]
__________________________________________________________________________________________________
padding4b3_branch2b (ZeroPaddin (None, None, None, 2 0           res4b3_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b3_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b3_branch2b[0][0]
__________________________________________________________________________________________________
bn4b3_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b3_branch2b[0][0]
__________________________________________________________________________________________________
res4b3_branch2b_relu (Activatio (None, None, None, 2 0           bn4b3_branch2b[0][0]
__________________________________________________________________________________________________
res4b3_branch2c (Conv2D)        (None, None, None, 1 262144      res4b3_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b3_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b3_branch2c[0][0]
__________________________________________________________________________________________________
res4b3 (Add)                    (None, None, None, 1 0           bn4b3_branch2c[0][0]
                                                                 res4b2_relu[0][0]
__________________________________________________________________________________________________
res4b3_relu (Activation)        (None, None, None, 1 0           res4b3[0][0]
__________________________________________________________________________________________________
res4b4_branch2a (Conv2D)        (None, None, None, 2 262144      res4b3_relu[0][0]
__________________________________________________________________________________________________
bn4b4_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b4_branch2a[0][0]
__________________________________________________________________________________________________
res4b4_branch2a_relu (Activatio (None, None, None, 2 0           bn4b4_branch2a[0][0]
__________________________________________________________________________________________________
padding4b4_branch2b (ZeroPaddin (None, None, None, 2 0           res4b4_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b4_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b4_branch2b[0][0]
__________________________________________________________________________________________________
bn4b4_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b4_branch2b[0][0]
__________________________________________________________________________________________________
res4b4_branch2b_relu (Activatio (None, None, None, 2 0           bn4b4_branch2b[0][0]
__________________________________________________________________________________________________
res4b4_branch2c (Conv2D)        (None, None, None, 1 262144      res4b4_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b4_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b4_branch2c[0][0]
__________________________________________________________________________________________________
res4b4 (Add)                    (None, None, None, 1 0           bn4b4_branch2c[0][0]
                                                                 res4b3_relu[0][0]
__________________________________________________________________________________________________
res4b4_relu (Activation)        (None, None, None, 1 0           res4b4[0][0]
__________________________________________________________________________________________________
res4b5_branch2a (Conv2D)        (None, None, None, 2 262144      res4b4_relu[0][0]
__________________________________________________________________________________________________
bn4b5_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b5_branch2a[0][0]
__________________________________________________________________________________________________
res4b5_branch2a_relu (Activatio (None, None, None, 2 0           bn4b5_branch2a[0][0]
__________________________________________________________________________________________________
padding4b5_branch2b (ZeroPaddin (None, None, None, 2 0           res4b5_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b5_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b5_branch2b[0][0]
__________________________________________________________________________________________________
bn4b5_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b5_branch2b[0][0]
__________________________________________________________________________________________________
res4b5_branch2b_relu (Activatio (None, None, None, 2 0           bn4b5_branch2b[0][0]
__________________________________________________________________________________________________
res4b5_branch2c (Conv2D)        (None, None, None, 1 262144      res4b5_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b5_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b5_branch2c[0][0]
__________________________________________________________________________________________________
res4b5 (Add)                    (None, None, None, 1 0           bn4b5_branch2c[0][0]
                                                                 res4b4_relu[0][0]
__________________________________________________________________________________________________
res4b5_relu (Activation)        (None, None, None, 1 0           res4b5[0][0]
__________________________________________________________________________________________________
res4b6_branch2a (Conv2D)        (None, None, None, 2 262144      res4b5_relu[0][0]
__________________________________________________________________________________________________
bn4b6_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b6_branch2a[0][0]
__________________________________________________________________________________________________
res4b6_branch2a_relu (Activatio (None, None, None, 2 0           bn4b6_branch2a[0][0]
__________________________________________________________________________________________________
padding4b6_branch2b (ZeroPaddin (None, None, None, 2 0           res4b6_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b6_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b6_branch2b[0][0]
__________________________________________________________________________________________________
bn4b6_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b6_branch2b[0][0]
__________________________________________________________________________________________________
res4b6_branch2b_relu (Activatio (None, None, None, 2 0           bn4b6_branch2b[0][0]
__________________________________________________________________________________________________
res4b6_branch2c (Conv2D)        (None, None, None, 1 262144      res4b6_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b6_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b6_branch2c[0][0]
__________________________________________________________________________________________________
res4b6 (Add)                    (None, None, None, 1 0           bn4b6_branch2c[0][0]
                                                                 res4b5_relu[0][0]
__________________________________________________________________________________________________
res4b6_relu (Activation)        (None, None, None, 1 0           res4b6[0][0]
__________________________________________________________________________________________________
res4b7_branch2a (Conv2D)        (None, None, None, 2 262144      res4b6_relu[0][0]
__________________________________________________________________________________________________
bn4b7_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b7_branch2a[0][0]
__________________________________________________________________________________________________
res4b7_branch2a_relu (Activatio (None, None, None, 2 0           bn4b7_branch2a[0][0]
__________________________________________________________________________________________________
padding4b7_branch2b (ZeroPaddin (None, None, None, 2 0           res4b7_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b7_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b7_branch2b[0][0]
__________________________________________________________________________________________________
bn4b7_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b7_branch2b[0][0]
__________________________________________________________________________________________________
res4b7_branch2b_relu (Activatio (None, None, None, 2 0           bn4b7_branch2b[0][0]
__________________________________________________________________________________________________
res4b7_branch2c (Conv2D)        (None, None, None, 1 262144      res4b7_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b7_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b7_branch2c[0][0]
__________________________________________________________________________________________________
res4b7 (Add)                    (None, None, None, 1 0           bn4b7_branch2c[0][0]
                                                                 res4b6_relu[0][0]
__________________________________________________________________________________________________
res4b7_relu (Activation)        (None, None, None, 1 0           res4b7[0][0]
__________________________________________________________________________________________________
res4b8_branch2a (Conv2D)        (None, None, None, 2 262144      res4b7_relu[0][0]
__________________________________________________________________________________________________
bn4b8_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b8_branch2a[0][0]
__________________________________________________________________________________________________
res4b8_branch2a_relu (Activatio (None, None, None, 2 0           bn4b8_branch2a[0][0]
__________________________________________________________________________________________________
padding4b8_branch2b (ZeroPaddin (None, None, None, 2 0           res4b8_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b8_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b8_branch2b[0][0]
__________________________________________________________________________________________________
bn4b8_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b8_branch2b[0][0]
__________________________________________________________________________________________________
res4b8_branch2b_relu (Activatio (None, None, None, 2 0           bn4b8_branch2b[0][0]
__________________________________________________________________________________________________
res4b8_branch2c (Conv2D)        (None, None, None, 1 262144      res4b8_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b8_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b8_branch2c[0][0]
__________________________________________________________________________________________________
res4b8 (Add)                    (None, None, None, 1 0           bn4b8_branch2c[0][0]
                                                                 res4b7_relu[0][0]
__________________________________________________________________________________________________
res4b8_relu (Activation)        (None, None, None, 1 0           res4b8[0][0]
__________________________________________________________________________________________________
res4b9_branch2a (Conv2D)        (None, None, None, 2 262144      res4b8_relu[0][0]
__________________________________________________________________________________________________
bn4b9_branch2a (BatchNormalizat (None, None, None, 2 1024        res4b9_branch2a[0][0]
__________________________________________________________________________________________________
res4b9_branch2a_relu (Activatio (None, None, None, 2 0           bn4b9_branch2a[0][0]
__________________________________________________________________________________________________
padding4b9_branch2b (ZeroPaddin (None, None, None, 2 0           res4b9_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b9_branch2b (Conv2D)        (None, None, None, 2 589824      padding4b9_branch2b[0][0]
__________________________________________________________________________________________________
bn4b9_branch2b (BatchNormalizat (None, None, None, 2 1024        res4b9_branch2b[0][0]
__________________________________________________________________________________________________
res4b9_branch2b_relu (Activatio (None, None, None, 2 0           bn4b9_branch2b[0][0]
__________________________________________________________________________________________________
res4b9_branch2c (Conv2D)        (None, None, None, 1 262144      res4b9_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b9_branch2c (BatchNormalizat (None, None, None, 1 4096        res4b9_branch2c[0][0]
__________________________________________________________________________________________________
res4b9 (Add)                    (None, None, None, 1 0           bn4b9_branch2c[0][0]
                                                                 res4b8_relu[0][0]
__________________________________________________________________________________________________
res4b9_relu (Activation)        (None, None, None, 1 0           res4b9[0][0]
__________________________________________________________________________________________________
res4b10_branch2a (Conv2D)       (None, None, None, 2 262144      res4b9_relu[0][0]
__________________________________________________________________________________________________
bn4b10_branch2a (BatchNormaliza (None, None, None, 2 1024        res4b10_branch2a[0][0]
__________________________________________________________________________________________________
res4b10_branch2a_relu (Activati (None, None, None, 2 0           bn4b10_branch2a[0][0]
__________________________________________________________________________________________________
padding4b10_branch2b (ZeroPaddi (None, None, None, 2 0           res4b10_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b10_branch2b (Conv2D)       (None, None, None, 2 589824      padding4b10_branch2b[0][0]
__________________________________________________________________________________________________
bn4b10_branch2b (BatchNormaliza (None, None, None, 2 1024        res4b10_branch2b[0][0]
__________________________________________________________________________________________________
res4b10_branch2b_relu (Activati (None, None, None, 2 0           bn4b10_branch2b[0][0]
__________________________________________________________________________________________________ res4b10_branch2c (Conv2D) (None, None, None, 1 262144 res4b10_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b10_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b10_branch2c[0][0] __________________________________________________________________________________________________ res4b10 (Add) (None, None, None, 1 0 bn4b10_branch2c[0][0] res4b9_relu[0][0] __________________________________________________________________________________________________ res4b10_relu (Activation) (None, None, None, 1 0 res4b10[0][0] __________________________________________________________________________________________________ res4b11_branch2a (Conv2D) (None, None, None, 2 262144 res4b10_relu[0][0] __________________________________________________________________________________________________ bn4b11_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b11_branch2a[0][0] __________________________________________________________________________________________________ res4b11_branch2a_relu (Activati (None, None, None, 2 0 bn4b11_branch2a[0][0] __________________________________________________________________________________________________ padding4b11_branch2b (ZeroPaddi (None, None, None, 2 0 res4b11_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b11_branch2b (Conv2D) (None, None, None, 2 589824 padding4b11_branch2b[0][0] __________________________________________________________________________________________________ bn4b11_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b11_branch2b[0][0] __________________________________________________________________________________________________ res4b11_branch2b_relu (Activati (None, None, None, 2 0 bn4b11_branch2b[0][0] 
__________________________________________________________________________________________________ res4b11_branch2c (Conv2D) (None, None, None, 1 262144 res4b11_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b11_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b11_branch2c[0][0] __________________________________________________________________________________________________ res4b11 (Add) (None, None, None, 1 0 bn4b11_branch2c[0][0] res4b10_relu[0][0] __________________________________________________________________________________________________ res4b11_relu (Activation) (None, None, None, 1 0 res4b11[0][0] __________________________________________________________________________________________________ res4b12_branch2a (Conv2D) (None, None, None, 2 262144 res4b11_relu[0][0] __________________________________________________________________________________________________ bn4b12_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b12_branch2a[0][0] __________________________________________________________________________________________________ res4b12_branch2a_relu (Activati (None, None, None, 2 0 bn4b12_branch2a[0][0] __________________________________________________________________________________________________ padding4b12_branch2b (ZeroPaddi (None, None, None, 2 0 res4b12_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b12_branch2b (Conv2D) (None, None, None, 2 589824 padding4b12_branch2b[0][0] __________________________________________________________________________________________________ bn4b12_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b12_branch2b[0][0] __________________________________________________________________________________________________ res4b12_branch2b_relu (Activati (None, None, None, 2 0 bn4b12_branch2b[0][0] 
__________________________________________________________________________________________________ res4b12_branch2c (Conv2D) (None, None, None, 1 262144 res4b12_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b12_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b12_branch2c[0][0] __________________________________________________________________________________________________ res4b12 (Add) (None, None, None, 1 0 bn4b12_branch2c[0][0] res4b11_relu[0][0] __________________________________________________________________________________________________ res4b12_relu (Activation) (None, None, None, 1 0 res4b12[0][0] __________________________________________________________________________________________________ res4b13_branch2a (Conv2D) (None, None, None, 2 262144 res4b12_relu[0][0] __________________________________________________________________________________________________ bn4b13_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b13_branch2a[0][0] __________________________________________________________________________________________________ res4b13_branch2a_relu (Activati (None, None, None, 2 0 bn4b13_branch2a[0][0] __________________________________________________________________________________________________ padding4b13_branch2b (ZeroPaddi (None, None, None, 2 0 res4b13_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b13_branch2b (Conv2D) (None, None, None, 2 589824 padding4b13_branch2b[0][0] __________________________________________________________________________________________________ bn4b13_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b13_branch2b[0][0] __________________________________________________________________________________________________ res4b13_branch2b_relu (Activati (None, None, None, 2 0 bn4b13_branch2b[0][0] 
__________________________________________________________________________________________________ res4b13_branch2c (Conv2D) (None, None, None, 1 262144 res4b13_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b13_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b13_branch2c[0][0] __________________________________________________________________________________________________ res4b13 (Add) (None, None, None, 1 0 bn4b13_branch2c[0][0] res4b12_relu[0][0] __________________________________________________________________________________________________ res4b13_relu (Activation) (None, None, None, 1 0 res4b13[0][0] __________________________________________________________________________________________________ res4b14_branch2a (Conv2D) (None, None, None, 2 262144 res4b13_relu[0][0] __________________________________________________________________________________________________ bn4b14_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b14_branch2a[0][0] __________________________________________________________________________________________________ res4b14_branch2a_relu (Activati (None, None, None, 2 0 bn4b14_branch2a[0][0] __________________________________________________________________________________________________ padding4b14_branch2b (ZeroPaddi (None, None, None, 2 0 res4b14_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b14_branch2b (Conv2D) (None, None, None, 2 589824 padding4b14_branch2b[0][0] __________________________________________________________________________________________________ bn4b14_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b14_branch2b[0][0] __________________________________________________________________________________________________ res4b14_branch2b_relu (Activati (None, None, None, 2 0 bn4b14_branch2b[0][0] 
__________________________________________________________________________________________________ res4b14_branch2c (Conv2D) (None, None, None, 1 262144 res4b14_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b14_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b14_branch2c[0][0] __________________________________________________________________________________________________ res4b14 (Add) (None, None, None, 1 0 bn4b14_branch2c[0][0] res4b13_relu[0][0] __________________________________________________________________________________________________ res4b14_relu (Activation) (None, None, None, 1 0 res4b14[0][0] __________________________________________________________________________________________________ res4b15_branch2a (Conv2D) (None, None, None, 2 262144 res4b14_relu[0][0] __________________________________________________________________________________________________ bn4b15_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b15_branch2a[0][0] __________________________________________________________________________________________________ res4b15_branch2a_relu (Activati (None, None, None, 2 0 bn4b15_branch2a[0][0] __________________________________________________________________________________________________ padding4b15_branch2b (ZeroPaddi (None, None, None, 2 0 res4b15_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b15_branch2b (Conv2D) (None, None, None, 2 589824 padding4b15_branch2b[0][0] __________________________________________________________________________________________________ bn4b15_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b15_branch2b[0][0] __________________________________________________________________________________________________ res4b15_branch2b_relu (Activati (None, None, None, 2 0 bn4b15_branch2b[0][0] 
__________________________________________________________________________________________________ res4b15_branch2c (Conv2D) (None, None, None, 1 262144 res4b15_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b15_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b15_branch2c[0][0] __________________________________________________________________________________________________ res4b15 (Add) (None, None, None, 1 0 bn4b15_branch2c[0][0] res4b14_relu[0][0] __________________________________________________________________________________________________ res4b15_relu (Activation) (None, None, None, 1 0 res4b15[0][0] __________________________________________________________________________________________________ res4b16_branch2a (Conv2D) (None, None, None, 2 262144 res4b15_relu[0][0] __________________________________________________________________________________________________ bn4b16_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b16_branch2a[0][0] __________________________________________________________________________________________________ res4b16_branch2a_relu (Activati (None, None, None, 2 0 bn4b16_branch2a[0][0] __________________________________________________________________________________________________ padding4b16_branch2b (ZeroPaddi (None, None, None, 2 0 res4b16_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b16_branch2b (Conv2D) (None, None, None, 2 589824 padding4b16_branch2b[0][0] __________________________________________________________________________________________________ bn4b16_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b16_branch2b[0][0] __________________________________________________________________________________________________ res4b16_branch2b_relu (Activati (None, None, None, 2 0 bn4b16_branch2b[0][0] 
__________________________________________________________________________________________________ res4b16_branch2c (Conv2D) (None, None, None, 1 262144 res4b16_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b16_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b16_branch2c[0][0] __________________________________________________________________________________________________ res4b16 (Add) (None, None, None, 1 0 bn4b16_branch2c[0][0] res4b15_relu[0][0] __________________________________________________________________________________________________ res4b16_relu (Activation) (None, None, None, 1 0 res4b16[0][0] __________________________________________________________________________________________________ res4b17_branch2a (Conv2D) (None, None, None, 2 262144 res4b16_relu[0][0] __________________________________________________________________________________________________ bn4b17_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b17_branch2a[0][0] __________________________________________________________________________________________________ res4b17_branch2a_relu (Activati (None, None, None, 2 0 bn4b17_branch2a[0][0] __________________________________________________________________________________________________ padding4b17_branch2b (ZeroPaddi (None, None, None, 2 0 res4b17_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b17_branch2b (Conv2D) (None, None, None, 2 589824 padding4b17_branch2b[0][0] __________________________________________________________________________________________________ bn4b17_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b17_branch2b[0][0] __________________________________________________________________________________________________ res4b17_branch2b_relu (Activati (None, None, None, 2 0 bn4b17_branch2b[0][0] 
__________________________________________________________________________________________________ res4b17_branch2c (Conv2D) (None, None, None, 1 262144 res4b17_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b17_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b17_branch2c[0][0] __________________________________________________________________________________________________ res4b17 (Add) (None, None, None, 1 0 bn4b17_branch2c[0][0] res4b16_relu[0][0] __________________________________________________________________________________________________ res4b17_relu (Activation) (None, None, None, 1 0 res4b17[0][0] __________________________________________________________________________________________________ res4b18_branch2a (Conv2D) (None, None, None, 2 262144 res4b17_relu[0][0] __________________________________________________________________________________________________ bn4b18_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b18_branch2a[0][0] __________________________________________________________________________________________________ res4b18_branch2a_relu (Activati (None, None, None, 2 0 bn4b18_branch2a[0][0] __________________________________________________________________________________________________ padding4b18_branch2b (ZeroPaddi (None, None, None, 2 0 res4b18_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b18_branch2b (Conv2D) (None, None, None, 2 589824 padding4b18_branch2b[0][0] __________________________________________________________________________________________________ bn4b18_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b18_branch2b[0][0] __________________________________________________________________________________________________ res4b18_branch2b_relu (Activati (None, None, None, 2 0 bn4b18_branch2b[0][0] 
__________________________________________________________________________________________________ res4b18_branch2c (Conv2D) (None, None, None, 1 262144 res4b18_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b18_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b18_branch2c[0][0] __________________________________________________________________________________________________ res4b18 (Add) (None, None, None, 1 0 bn4b18_branch2c[0][0] res4b17_relu[0][0] __________________________________________________________________________________________________ res4b18_relu (Activation) (None, None, None, 1 0 res4b18[0][0] __________________________________________________________________________________________________ res4b19_branch2a (Conv2D) (None, None, None, 2 262144 res4b18_relu[0][0] __________________________________________________________________________________________________ bn4b19_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b19_branch2a[0][0] __________________________________________________________________________________________________ res4b19_branch2a_relu (Activati (None, None, None, 2 0 bn4b19_branch2a[0][0] __________________________________________________________________________________________________ padding4b19_branch2b (ZeroPaddi (None, None, None, 2 0 res4b19_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b19_branch2b (Conv2D) (None, None, None, 2 589824 padding4b19_branch2b[0][0] __________________________________________________________________________________________________ bn4b19_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b19_branch2b[0][0] __________________________________________________________________________________________________ res4b19_branch2b_relu (Activati (None, None, None, 2 0 bn4b19_branch2b[0][0] 
__________________________________________________________________________________________________ res4b19_branch2c (Conv2D) (None, None, None, 1 262144 res4b19_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b19_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b19_branch2c[0][0] __________________________________________________________________________________________________ res4b19 (Add) (None, None, None, 1 0 bn4b19_branch2c[0][0] res4b18_relu[0][0] __________________________________________________________________________________________________ res4b19_relu (Activation) (None, None, None, 1 0 res4b19[0][0] __________________________________________________________________________________________________ res4b20_branch2a (Conv2D) (None, None, None, 2 262144 res4b19_relu[0][0] __________________________________________________________________________________________________ bn4b20_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b20_branch2a[0][0] __________________________________________________________________________________________________ res4b20_branch2a_relu (Activati (None, None, None, 2 0 bn4b20_branch2a[0][0] __________________________________________________________________________________________________ padding4b20_branch2b (ZeroPaddi (None, None, None, 2 0 res4b20_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b20_branch2b (Conv2D) (None, None, None, 2 589824 padding4b20_branch2b[0][0] __________________________________________________________________________________________________ bn4b20_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b20_branch2b[0][0] __________________________________________________________________________________________________ res4b20_branch2b_relu (Activati (None, None, None, 2 0 bn4b20_branch2b[0][0] 
__________________________________________________________________________________________________ res4b20_branch2c (Conv2D) (None, None, None, 1 262144 res4b20_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b20_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b20_branch2c[0][0] __________________________________________________________________________________________________ res4b20 (Add) (None, None, None, 1 0 bn4b20_branch2c[0][0] res4b19_relu[0][0] __________________________________________________________________________________________________ res4b20_relu (Activation) (None, None, None, 1 0 res4b20[0][0] __________________________________________________________________________________________________ res4b21_branch2a (Conv2D) (None, None, None, 2 262144 res4b20_relu[0][0] __________________________________________________________________________________________________ bn4b21_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b21_branch2a[0][0] __________________________________________________________________________________________________ res4b21_branch2a_relu (Activati (None, None, None, 2 0 bn4b21_branch2a[0][0] __________________________________________________________________________________________________ padding4b21_branch2b (ZeroPaddi (None, None, None, 2 0 res4b21_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b21_branch2b (Conv2D) (None, None, None, 2 589824 padding4b21_branch2b[0][0] __________________________________________________________________________________________________ bn4b21_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b21_branch2b[0][0] __________________________________________________________________________________________________ res4b21_branch2b_relu (Activati (None, None, None, 2 0 bn4b21_branch2b[0][0] 
__________________________________________________________________________________________________ res4b21_branch2c (Conv2D) (None, None, None, 1 262144 res4b21_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b21_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b21_branch2c[0][0] __________________________________________________________________________________________________ res4b21 (Add) (None, None, None, 1 0 bn4b21_branch2c[0][0] res4b20_relu[0][0] __________________________________________________________________________________________________ res4b21_relu (Activation) (None, None, None, 1 0 res4b21[0][0] __________________________________________________________________________________________________ res4b22_branch2a (Conv2D) (None, None, None, 2 262144 res4b21_relu[0][0] __________________________________________________________________________________________________ bn4b22_branch2a (BatchNormaliza (None, None, None, 2 1024 res4b22_branch2a[0][0] __________________________________________________________________________________________________ res4b22_branch2a_relu (Activati (None, None, None, 2 0 bn4b22_branch2a[0][0] __________________________________________________________________________________________________ padding4b22_branch2b (ZeroPaddi (None, None, None, 2 0 res4b22_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b22_branch2b (Conv2D) (None, None, None, 2 589824 padding4b22_branch2b[0][0] __________________________________________________________________________________________________ bn4b22_branch2b (BatchNormaliza (None, None, None, 2 1024 res4b22_branch2b[0][0] __________________________________________________________________________________________________ res4b22_branch2b_relu (Activati (None, None, None, 2 0 bn4b22_branch2b[0][0] 
__________________________________________________________________________________________________ res4b22_branch2c (Conv2D) (None, None, None, 1 262144 res4b22_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b22_branch2c (BatchNormaliza (None, None, None, 1 4096 res4b22_branch2c[0][0] __________________________________________________________________________________________________ res4b22 (Add) (None, None, None, 1 0 bn4b22_branch2c[0][0] res4b21_relu[0][0] __________________________________________________________________________________________________ res4b22_relu (Activation) (None, None, None, 1 0 res4b22[0][0] __________________________________________________________________________________________________ res5a_branch2a (Conv2D) (None, None, None, 5 524288 res4b22_relu[0][0] __________________________________________________________________________________________________ bn5a_branch2a (BatchNormalizati (None, None, None, 5 2048 res5a_branch2a[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu (Activation (None, None, None, 5 0 bn5a_branch2a[0][0] __________________________________________________________________________________________________ padding5a_branch2b (ZeroPadding (None, None, None, 5 0 res5a_branch2a_relu[0][0] __________________________________________________________________________________________________ res5a_branch2b (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b[0][0] __________________________________________________________________________________________________ bn5a_branch2b (BatchNormalizati (None, None, None, 5 2048 res5a_branch2b[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu (Activation (None, None, None, 5 0 bn5a_branch2b[0][0] 
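A quick sanity check on the numbers repeated throughout the conv4 stage above. The shape column is truncated by the fixed-width summary, but the channel widths can be inferred from the parameter counts themselves (they match ResNet-101's standard conv4 bottleneck, 1024 → 256 → 256 → 1024). The helpers `conv_params` and `bn_params` below are illustrative names, not library functions; this is a minimal sketch assuming the backbone convolutions carry no bias term, which is consistent with the printed counts:

```python
# Parameter accounting for one conv4-stage bottleneck block (res4bN),
# reproducing the counts printed in the summary above.

def conv_params(k, c_in, c_out):
    # k x k convolution without bias: one weight per (kernel cell, in, out).
    return k * k * c_in * c_out

def bn_params(c):
    # BatchNormalization stores gamma, beta, moving_mean, moving_variance.
    return 4 * c

assert conv_params(1, 1024, 256) == 262144  # res4bN_branch2a (1x1 reduce)
assert conv_params(3, 256, 256) == 589824   # res4bN_branch2b (3x3)
assert conv_params(1, 256, 1024) == 262144  # res4bN_branch2c (1x1 expand)
assert bn_params(256) == 1024               # bn4bN_branch2a / bn4bN_branch2b
assert bn_params(1024) == 4096              # bn4bN_branch2c
print("conv4 bottleneck parameter counts match the summary")
```

The same arithmetic explains the conv5 rows that follow (e.g. 2048-channel BN layers give 4 × 2048 = 8192).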
res5a_branch2c (Conv2D)          (None, None, None, 2 1048576     res5a_branch2b_relu[0][0]
res5a_branch1 (Conv2D)           (None, None, None, 2 2097152     res4b22_relu[0][0]
bn5a_branch2c (BatchNormalizati  (None, None, None, 2 8192        res5a_branch2c[0][0]
bn5a_branch1 (BatchNormalizatio  (None, None, None, 2 8192        res5a_branch1[0][0]
res5a (Add)                      (None, None, None, 2 0           bn5a_branch2c[0][0]
                                                                  bn5a_branch1[0][0]
res5a_relu (Activation)          (None, None, None, 2 0           res5a[0][0]
res5b_branch2a (Conv2D)          (None, None, None, 5 1048576     res5a_relu[0][0]
bn5b_branch2a (BatchNormalizati  (None, None, None, 5 2048        res5b_branch2a[0][0]
res5b_branch2a_relu (Activation  (None, None, None, 5 0           bn5b_branch2a[0][0]
padding5b_branch2b (ZeroPadding  (None, None, None, 5 0           res5b_branch2a_relu[0][0]
res5b_branch2b (Conv2D)          (None, None, None, 5 2359296     padding5b_branch2b[0][0]
bn5b_branch2b (BatchNormalizati  (None, None, None, 5 2048        res5b_branch2b[0][0]
res5b_branch2b_relu (Activation  (None, None, None, 5 0           bn5b_branch2b[0][0]
res5b_branch2c (Conv2D)          (None, None, None, 2 1048576     res5b_branch2b_relu[0][0]
bn5b_branch2c (BatchNormalizati  (None, None, None, 2 8192        res5b_branch2c[0][0]
res5b (Add)                      (None, None, None, 2 0           bn5b_branch2c[0][0]
                                                                  res5a_relu[0][0]
res5b_relu (Activation)          (None, None, None, 2 0           res5b[0][0]
res5c_branch2a (Conv2D)          (None, None, None, 5 1048576     res5b_relu[0][0]
bn5c_branch2a (BatchNormalizati  (None, None, None, 5 2048        res5c_branch2a[0][0]
res5c_branch2a_relu (Activation  (None, None, None, 5 0           bn5c_branch2a[0][0]
padding5c_branch2b (ZeroPadding  (None, None, None, 5 0           res5c_branch2a_relu[0][0]
res5c_branch2b (Conv2D)          (None, None, None, 5 2359296     padding5c_branch2b[0][0]
bn5c_branch2b (BatchNormalizati  (None, None, None, 5 2048        res5c_branch2b[0][0]
res5c_branch2b_relu (Activation  (None, None, None, 5 0           bn5c_branch2b[0][0]
res5c_branch2c (Conv2D)          (None, None, None, 2 1048576     res5c_branch2b_relu[0][0]
bn5c_branch2c (BatchNormalizati  (None, None, None, 2 8192        res5c_branch2c[0][0]
res5c (Add)                      (None, None, None, 2 0           bn5c_branch2c[0][0]
                                                                  res5b_relu[0][0]
res5c_relu (Activation)          (None, None, None, 2 0           res5c[0][0]
C5_reduced (Conv2D)              (None, None, None, 2 524544      res5c_relu[0][0]
P5_upsampled (UpsampleLike)      (None, None, None, 2 0           C5_reduced[0][0]
                                                                  res4b22_relu[0][0]
C4_reduced (Conv2D)              (None, None, None, 2 262400      res4b22_relu[0][0]
P4_merged (Add)                  (None, None, None, 2 0           P5_upsampled[0][0]
                                                                  C4_reduced[0][0]
P4_upsampled (UpsampleLike)      (None, None, None, 2 0           P4_merged[0][0]
                                                                  res3b3_relu[0][0]
C3_reduced (Conv2D)              (None, None, None, 2 131328      res3b3_relu[0][0]
P6 (Conv2D)                      (None, None, None, 2 4718848     res5c_relu[0][0]
P3_merged (Add)                  (None, None, None, 2 0           P4_upsampled[0][0]
                                                                  C3_reduced[0][0]
C6_relu (Activation)             (None, None, None, 2 0           P6[0][0]
P3 (Conv2D)                      (None, None, None, 2 590080      P3_merged[0][0]
P4 (Conv2D)                      (None, None, None, 2 590080      P4_merged[0][0]
P5 (Conv2D)                      (None, None, None, 2 590080      C5_reduced[0][0]
P7 (Conv2D)                      (None, None, None, 2 590080      C6_relu[0][0]
regression_submodel (Model)      (None, None, 4)      2443300     P3[0][0]
                                                                  P4[0][0]
                                                                  P5[0][0]
                                                                  P6[0][0]
                                                                  P7[0][0]
classification_submodel (Model)  (None, None, 1)      2381065     P3[0][0]
                                                                  P4[0][0]
                                                                  P5[0][0]
                                                                  P6[0][0]
                                                                  P7[0][0]
regression (Concatenate)         (None, None, 4)      0           regression_submodel[1][0]
                                                                  regression_submodel[2][0]
                                                                  regression_submodel[3][0]
                                                                  regression_submodel[4][0]
                                                                  regression_submodel[5][0]
__________________________________________________________________________________________________ classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0] ================================================================================================== Total params: 55,430,445 Trainable params: 55,219,757 Non-trainable params: 210,688 __________________________________________________________________________________________________ None Epoch 1/150 1/500 [..............................] - ETA: 1:09:39 - loss: 3.9508 - regression_loss: 2.8208 - classification_loss: 1.1300 2/500 [..............................] - ETA: 36:03 - loss: 4.0764 - regression_loss: 2.9468 - classification_loss: 1.1295 3/500 [..............................] - ETA: 24:52 - loss: 4.0548 - regression_loss: 2.9258 - classification_loss: 1.1290 4/500 [..............................] - ETA: 19:17 - loss: 4.0699 - regression_loss: 2.9407 - classification_loss: 1.1293 5/500 [..............................] - ETA: 15:56 - loss: 4.1108 - regression_loss: 2.9813 - classification_loss: 1.1296 6/500 [..............................] - ETA: 13:41 - loss: 4.1379 - regression_loss: 3.0083 - classification_loss: 1.1297 7/500 [..............................] - ETA: 12:05 - loss: 4.0794 - regression_loss: 2.9498 - classification_loss: 1.1296 8/500 [..............................] - ETA: 10:55 - loss: 4.0740 - regression_loss: 2.9443 - classification_loss: 1.1297 9/500 [..............................] - ETA: 10:00 - loss: 4.0655 - regression_loss: 2.9359 - classification_loss: 1.1296 10/500 [..............................] - ETA: 9:15 - loss: 4.0554 - regression_loss: 2.9258 - classification_loss: 1.1296 11/500 [..............................] - ETA: 8:38 - loss: 4.0441 - regression_loss: 2.9145 - classification_loss: 1.1296 12/500 [..............................] 
- ETA: 8:07 - loss: 4.0481 - regression_loss: 2.9185 - classification_loss: 1.1295 13/500 [..............................] - ETA: 7:40 - loss: 4.0506 - regression_loss: 2.9210 - classification_loss: 1.1296 14/500 [..............................] - ETA: 7:18 - loss: 4.0577 - regression_loss: 2.9281 - classification_loss: 1.1296 15/500 [..............................] - ETA: 6:58 - loss: 4.0641 - regression_loss: 2.9345 - classification_loss: 1.1297 16/500 [..............................] - ETA: 6:41 - loss: 4.0621 - regression_loss: 2.9325 - classification_loss: 1.1296 17/500 [>.............................] - ETA: 6:25 - loss: 4.0561 - regression_loss: 2.9265 - classification_loss: 1.1297 18/500 [>.............................] - ETA: 6:12 - loss: 4.0624 - regression_loss: 2.9327 - classification_loss: 1.1297 19/500 [>.............................] - ETA: 6:00 - loss: 4.0578 - regression_loss: 2.9281 - classification_loss: 1.1297 20/500 [>.............................] - ETA: 5:49 - loss: 4.0624 - regression_loss: 2.9327 - classification_loss: 1.1298 21/500 [>.............................] - ETA: 5:39 - loss: 4.0548 - regression_loss: 2.9251 - classification_loss: 1.1297 22/500 [>.............................] - ETA: 5:30 - loss: 4.0424 - regression_loss: 2.9126 - classification_loss: 1.1298 23/500 [>.............................] - ETA: 5:22 - loss: 4.0377 - regression_loss: 2.9078 - classification_loss: 1.1299 24/500 [>.............................] - ETA: 5:14 - loss: 4.0352 - regression_loss: 2.9054 - classification_loss: 1.1298 25/500 [>.............................] - ETA: 5:07 - loss: 4.0404 - regression_loss: 2.9106 - classification_loss: 1.1298 26/500 [>.............................] - ETA: 5:01 - loss: 4.0468 - regression_loss: 2.9170 - classification_loss: 1.1298 27/500 [>.............................] - ETA: 4:54 - loss: 4.0485 - regression_loss: 2.9188 - classification_loss: 1.1298 28/500 [>.............................] 
- ETA: 4:49 - loss: 4.0500 - regression_loss: 2.9203 - classification_loss: 1.1297 29/500 [>.............................] - ETA: 4:43 - loss: 4.0600 - regression_loss: 2.9302 - classification_loss: 1.1298 30/500 [>.............................] - ETA: 4:38 - loss: 4.0608 - regression_loss: 2.9310 - classification_loss: 1.1297 31/500 [>.............................] - ETA: 4:34 - loss: 4.0586 - regression_loss: 2.9289 - classification_loss: 1.1297 32/500 [>.............................] - ETA: 4:29 - loss: 4.0511 - regression_loss: 2.9213 - classification_loss: 1.1298 33/500 [>.............................] - ETA: 4:25 - loss: 4.0405 - regression_loss: 2.9107 - classification_loss: 1.1298 34/500 [=>............................] - ETA: 4:21 - loss: 4.0371 - regression_loss: 2.9073 - classification_loss: 1.1298 35/500 [=>............................] - ETA: 4:17 - loss: 4.0348 - regression_loss: 2.9049 - classification_loss: 1.1298 36/500 [=>............................] - ETA: 4:14 - loss: 4.0373 - regression_loss: 2.9075 - classification_loss: 1.1298 37/500 [=>............................] - ETA: 4:10 - loss: 4.0386 - regression_loss: 2.9089 - classification_loss: 1.1298 38/500 [=>............................] - ETA: 4:07 - loss: 4.0401 - regression_loss: 2.9103 - classification_loss: 1.1297 39/500 [=>............................] - ETA: 4:04 - loss: 4.0466 - regression_loss: 2.9169 - classification_loss: 1.1297 40/500 [=>............................] - ETA: 4:01 - loss: 4.0423 - regression_loss: 2.9125 - classification_loss: 1.1297 41/500 [=>............................] - ETA: 3:59 - loss: 4.0424 - regression_loss: 2.9127 - classification_loss: 1.1297 42/500 [=>............................] - ETA: 3:56 - loss: 4.0380 - regression_loss: 2.9083 - classification_loss: 1.1297 43/500 [=>............................] - ETA: 3:54 - loss: 4.0387 - regression_loss: 2.9091 - classification_loss: 1.1297 44/500 [=>............................] 
- ETA: 3:51 - loss: 4.0408 - regression_loss: 2.9112 - classification_loss: 1.1296 45/500 [=>............................] - ETA: 3:49 - loss: 4.0370 - regression_loss: 2.9074 - classification_loss: 1.1296 46/500 [=>............................] - ETA: 3:46 - loss: 4.0364 - regression_loss: 2.9068 - classification_loss: 1.1296 47/500 [=>............................] - ETA: 3:44 - loss: 4.0390 - regression_loss: 2.9095 - classification_loss: 1.1295 48/500 [=>............................] - ETA: 3:42 - loss: 4.0378 - regression_loss: 2.9083 - classification_loss: 1.1295 49/500 [=>............................] - ETA: 3:40 - loss: 4.0367 - regression_loss: 2.9073 - classification_loss: 1.1294 50/500 [==>...........................] - ETA: 3:39 - loss: 4.0379 - regression_loss: 2.9085 - classification_loss: 1.1294 51/500 [==>...........................] - ETA: 3:37 - loss: 4.0338 - regression_loss: 2.9044 - classification_loss: 1.1294 52/500 [==>...........................] - ETA: 3:35 - loss: 4.0333 - regression_loss: 2.9040 - classification_loss: 1.1293 53/500 [==>...........................] - ETA: 3:33 - loss: 4.0333 - regression_loss: 2.9040 - classification_loss: 1.1293 54/500 [==>...........................] - ETA: 3:31 - loss: 4.0316 - regression_loss: 2.9023 - classification_loss: 1.1293 55/500 [==>...........................] - ETA: 3:30 - loss: 4.0308 - regression_loss: 2.9016 - classification_loss: 1.1293 56/500 [==>...........................] - ETA: 3:28 - loss: 4.0327 - regression_loss: 2.9034 - classification_loss: 1.1292 57/500 [==>...........................] - ETA: 3:27 - loss: 4.0335 - regression_loss: 2.9043 - classification_loss: 1.1292 58/500 [==>...........................] - ETA: 3:25 - loss: 4.0314 - regression_loss: 2.9023 - classification_loss: 1.1291 59/500 [==>...........................] - ETA: 3:24 - loss: 4.0323 - regression_loss: 2.9032 - classification_loss: 1.1291 60/500 [==>...........................] 
- ETA: 3:22 - loss: 4.0359 - regression_loss: 2.9068 - classification_loss: 1.1291 61/500 [==>...........................] - ETA: 3:21 - loss: 4.0367 - regression_loss: 2.9077 - classification_loss: 1.1291 62/500 [==>...........................] - ETA: 3:19 - loss: 4.0361 - regression_loss: 2.9071 - classification_loss: 1.1290 63/500 [==>...........................] - ETA: 3:18 - loss: 4.0351 - regression_loss: 2.9062 - classification_loss: 1.1289 64/500 [==>...........................] - ETA: 3:17 - loss: 4.0342 - regression_loss: 2.9052 - classification_loss: 1.1289 65/500 [==>...........................] - ETA: 3:15 - loss: 4.0325 - regression_loss: 2.9036 - classification_loss: 1.1289 66/500 [==>...........................] - ETA: 3:14 - loss: 4.0333 - regression_loss: 2.9044 - classification_loss: 1.1289 67/500 [===>..........................] - ETA: 3:13 - loss: 4.0320 - regression_loss: 2.9031 - classification_loss: 1.1288 68/500 [===>..........................] - ETA: 3:11 - loss: 4.0277 - regression_loss: 2.8988 - classification_loss: 1.1289 69/500 [===>..........................] - ETA: 3:10 - loss: 4.0264 - regression_loss: 2.8975 - classification_loss: 1.1289 70/500 [===>..........................] - ETA: 3:09 - loss: 4.0223 - regression_loss: 2.8934 - classification_loss: 1.1289 71/500 [===>..........................] - ETA: 3:08 - loss: 4.0218 - regression_loss: 2.8929 - classification_loss: 1.1288 72/500 [===>..........................] - ETA: 3:06 - loss: 4.0193 - regression_loss: 2.8905 - classification_loss: 1.1288 73/500 [===>..........................] - ETA: 3:05 - loss: 4.0178 - regression_loss: 2.8890 - classification_loss: 1.1288 74/500 [===>..........................] - ETA: 3:04 - loss: 4.0154 - regression_loss: 2.8867 - classification_loss: 1.1287 75/500 [===>..........................] - ETA: 3:03 - loss: 4.0150 - regression_loss: 2.8864 - classification_loss: 1.1286 76/500 [===>..........................] 
- ETA: 3:02 - loss: 4.0138 - regression_loss: 2.8853 - classification_loss: 1.1286 77/500 [===>..........................] - ETA: 3:01 - loss: 4.0131 - regression_loss: 2.8846 - classification_loss: 1.1285 78/500 [===>..........................] - ETA: 3:00 - loss: 4.0116 - regression_loss: 2.8831 - classification_loss: 1.1284 79/500 [===>..........................] - ETA: 2:59 - loss: 4.0147 - regression_loss: 2.8863 - classification_loss: 1.1284 80/500 [===>..........................] - ETA: 2:58 - loss: 4.0165 - regression_loss: 2.8881 - classification_loss: 1.1284 81/500 [===>..........................] - ETA: 2:58 - loss: 4.0163 - regression_loss: 2.8879 - classification_loss: 1.1283 82/500 [===>..........................] - ETA: 2:57 - loss: 4.0179 - regression_loss: 2.8896 - classification_loss: 1.1283 83/500 [===>..........................] - ETA: 2:56 - loss: 4.0169 - regression_loss: 2.8886 - classification_loss: 1.1283 84/500 [====>.........................] - ETA: 2:55 - loss: 4.0157 - regression_loss: 2.8874 - classification_loss: 1.1283 85/500 [====>.........................] - ETA: 2:54 - loss: 4.0163 - regression_loss: 2.8881 - classification_loss: 1.1282 86/500 [====>.........................] - ETA: 2:53 - loss: 4.0122 - regression_loss: 2.8840 - classification_loss: 1.1282 87/500 [====>.........................] - ETA: 2:52 - loss: 4.0132 - regression_loss: 2.8851 - classification_loss: 1.1281 88/500 [====>.........................] - ETA: 2:51 - loss: 4.0128 - regression_loss: 2.8847 - classification_loss: 1.1280 89/500 [====>.........................] - ETA: 2:51 - loss: 4.0100 - regression_loss: 2.8820 - classification_loss: 1.1280 90/500 [====>.........................] - ETA: 2:50 - loss: 4.0106 - regression_loss: 2.8827 - classification_loss: 1.1279 91/500 [====>.........................] - ETA: 2:49 - loss: 4.0101 - regression_loss: 2.8823 - classification_loss: 1.1278 92/500 [====>.........................] 
- ETA: 2:48 - loss: 4.0088 - regression_loss: 2.8812 - classification_loss: 1.1277 93/500 [====>.........................] - ETA: 2:47 - loss: 4.0075 - regression_loss: 2.8799 - classification_loss: 1.1276 94/500 [====>.........................] - ETA: 2:47 - loss: 4.0069 - regression_loss: 2.8793 - classification_loss: 1.1275 95/500 [====>.........................] - ETA: 2:46 - loss: 4.0084 - regression_loss: 2.8809 - classification_loss: 1.1275 96/500 [====>.........................] - ETA: 2:45 - loss: 4.0086 - regression_loss: 2.8812 - classification_loss: 1.1274 97/500 [====>.........................] - ETA: 2:44 - loss: 4.0119 - regression_loss: 2.8845 - classification_loss: 1.1274 98/500 [====>.........................] - ETA: 2:44 - loss: 4.0140 - regression_loss: 2.8866 - classification_loss: 1.1273 99/500 [====>.........................] - ETA: 2:43 - loss: 4.0115 - regression_loss: 2.8843 - classification_loss: 1.1272 100/500 [=====>........................] - ETA: 2:42 - loss: 4.0115 - regression_loss: 2.8843 - classification_loss: 1.1272 101/500 [=====>........................] - ETA: 2:41 - loss: 4.0111 - regression_loss: 2.8839 - classification_loss: 1.1272 102/500 [=====>........................] - ETA: 2:41 - loss: 4.0109 - regression_loss: 2.8838 - classification_loss: 1.1271 103/500 [=====>........................] - ETA: 2:40 - loss: 4.0095 - regression_loss: 2.8825 - classification_loss: 1.1270 104/500 [=====>........................] - ETA: 2:39 - loss: 4.0080 - regression_loss: 2.8811 - classification_loss: 1.1269 105/500 [=====>........................] - ETA: 2:38 - loss: 4.0048 - regression_loss: 2.8781 - classification_loss: 1.1267 106/500 [=====>........................] - ETA: 2:38 - loss: 4.0042 - regression_loss: 2.8776 - classification_loss: 1.1267 107/500 [=====>........................] - ETA: 2:37 - loss: 4.0014 - regression_loss: 2.8749 - classification_loss: 1.1265 108/500 [=====>........................] 
- ETA: 2:36 - loss: 3.9998 - regression_loss: 2.8734 - classification_loss: 1.1264 109/500 [=====>........................] - ETA: 2:36 - loss: 3.9985 - regression_loss: 2.8723 - classification_loss: 1.1262 110/500 [=====>........................] - ETA: 2:35 - loss: 3.9973 - regression_loss: 2.8713 - classification_loss: 1.1260 111/500 [=====>........................] - ETA: 2:34 - loss: 3.9954 - regression_loss: 2.8694 - classification_loss: 1.1259 112/500 [=====>........................] - ETA: 2:34 - loss: 3.9941 - regression_loss: 2.8683 - classification_loss: 1.1258 113/500 [=====>........................] - ETA: 2:33 - loss: 3.9920 - regression_loss: 2.8664 - classification_loss: 1.1256 114/500 [=====>........................] - ETA: 2:32 - loss: 3.9907 - regression_loss: 2.8653 - classification_loss: 1.1254 115/500 [=====>........................] - ETA: 2:32 - loss: 3.9893 - regression_loss: 2.8640 - classification_loss: 1.1253 116/500 [=====>........................] - ETA: 2:31 - loss: 3.9876 - regression_loss: 2.8624 - classification_loss: 1.1251 117/500 [======>.......................] - ETA: 2:31 - loss: 3.9866 - regression_loss: 2.8615 - classification_loss: 1.1250 118/500 [======>.......................] - ETA: 2:30 - loss: 3.9840 - regression_loss: 2.8593 - classification_loss: 1.1248 119/500 [======>.......................] - ETA: 2:29 - loss: 3.9817 - regression_loss: 2.8572 - classification_loss: 1.1245 120/500 [======>.......................] - ETA: 2:29 - loss: 3.9792 - regression_loss: 2.8547 - classification_loss: 1.1245 121/500 [======>.......................] - ETA: 2:28 - loss: 3.9768 - regression_loss: 2.8526 - classification_loss: 1.1242 122/500 [======>.......................] - ETA: 2:27 - loss: 3.9790 - regression_loss: 2.8549 - classification_loss: 1.1241 123/500 [======>.......................] - ETA: 2:27 - loss: 3.9772 - regression_loss: 2.8533 - classification_loss: 1.1239 124/500 [======>.......................] 
- ETA: 2:26 - loss: 3.9750 - regression_loss: 2.8514 - classification_loss: 1.1236 125/500 [======>.......................] - ETA: 2:26 - loss: 3.9735 - regression_loss: 2.8500 - classification_loss: 1.1236 126/500 [======>.......................] - ETA: 2:25 - loss: 3.9719 - regression_loss: 2.8486 - classification_loss: 1.1234 127/500 [======>.......................] - ETA: 2:25 - loss: 3.9692 - regression_loss: 2.8461 - classification_loss: 1.1231 128/500 [======>.......................] - ETA: 2:24 - loss: 3.9671 - regression_loss: 2.8443 - classification_loss: 1.1228 129/500 [======>.......................] - ETA: 2:23 - loss: 3.9646 - regression_loss: 2.8421 - classification_loss: 1.1225 130/500 [======>.......................] - ETA: 2:23 - loss: 3.9624 - regression_loss: 2.8400 - classification_loss: 1.1224 131/500 [======>.......................] - ETA: 2:22 - loss: 3.9613 - regression_loss: 2.8392 - classification_loss: 1.1221 132/500 [======>.......................] - ETA: 2:22 - loss: 3.9588 - regression_loss: 2.8368 - classification_loss: 1.1219 133/500 [======>.......................] - ETA: 2:21 - loss: 3.9562 - regression_loss: 2.8346 - classification_loss: 1.1216 134/500 [=======>......................] - ETA: 2:21 - loss: 3.9537 - regression_loss: 2.8326 - classification_loss: 1.1212 135/500 [=======>......................] - ETA: 2:20 - loss: 3.9524 - regression_loss: 2.8315 - classification_loss: 1.1209 136/500 [=======>......................] - ETA: 2:20 - loss: 3.9519 - regression_loss: 2.8311 - classification_loss: 1.1208 137/500 [=======>......................] - ETA: 2:19 - loss: 3.9504 - regression_loss: 2.8296 - classification_loss: 1.1207 138/500 [=======>......................] - ETA: 2:19 - loss: 3.9483 - regression_loss: 2.8279 - classification_loss: 1.1204 139/500 [=======>......................] - ETA: 2:18 - loss: 3.9477 - regression_loss: 2.8275 - classification_loss: 1.1202 140/500 [=======>......................] 
- ETA: 2:17 - loss: 3.9459 - regression_loss: 2.8260 - classification_loss: 1.1198 141/500 [=======>......................] - ETA: 2:17 - loss: 3.9434 - regression_loss: 2.8240 - classification_loss: 1.1194 142/500 [=======>......................] - ETA: 2:16 - loss: 3.9417 - regression_loss: 2.8226 - classification_loss: 1.1191 143/500 [=======>......................] - ETA: 2:16 - loss: 3.9387 - regression_loss: 2.8200 - classification_loss: 1.1187 144/500 [=======>......................] - ETA: 2:15 - loss: 3.9376 - regression_loss: 2.8192 - classification_loss: 1.1184 145/500 [=======>......................] - ETA: 2:15 - loss: 3.9354 - regression_loss: 2.8175 - classification_loss: 1.1179 146/500 [=======>......................] - ETA: 2:14 - loss: 3.9323 - regression_loss: 2.8151 - classification_loss: 1.1172 147/500 [=======>......................] - ETA: 2:14 - loss: 3.9301 - regression_loss: 2.8134 - classification_loss: 1.1167 148/500 [=======>......................] - ETA: 2:13 - loss: 3.9276 - regression_loss: 2.8114 - classification_loss: 1.1162 149/500 [=======>......................] - ETA: 2:13 - loss: 3.9244 - regression_loss: 2.8090 - classification_loss: 1.1155 150/500 [========>.....................] - ETA: 2:12 - loss: 3.9224 - regression_loss: 2.8075 - classification_loss: 1.1149 151/500 [========>.....................] - ETA: 2:12 - loss: 3.9231 - regression_loss: 2.8085 - classification_loss: 1.1146 152/500 [========>.....................] - ETA: 2:11 - loss: 3.9214 - regression_loss: 2.8075 - classification_loss: 1.1140 153/500 [========>.....................] - ETA: 2:11 - loss: 3.9191 - regression_loss: 2.8059 - classification_loss: 1.1132 154/500 [========>.....................] - ETA: 2:11 - loss: 3.9155 - regression_loss: 2.8030 - classification_loss: 1.1125 155/500 [========>.....................] - ETA: 2:10 - loss: 3.9135 - regression_loss: 2.8013 - classification_loss: 1.1122 156/500 [========>.....................] 
- ETA: 2:10 - loss: 3.9110 - regression_loss: 2.7995 - classification_loss: 1.1116 157/500 [========>.....................] - ETA: 2:09 - loss: 3.9084 - regression_loss: 2.7975 - classification_loss: 1.1109 158/500 [========>.....................] - ETA: 2:09 - loss: 3.9057 - regression_loss: 2.7955 - classification_loss: 1.1101 159/500 [========>.....................] - ETA: 2:08 - loss: 3.9030 - regression_loss: 2.7936 - classification_loss: 1.1094 160/500 [========>.....................] - ETA: 2:08 - loss: 3.9020 - regression_loss: 2.7931 - classification_loss: 1.1089 161/500 [========>.....................] - ETA: 2:07 - loss: 3.8991 - regression_loss: 2.7909 - classification_loss: 1.1083 162/500 [========>.....................] - ETA: 2:07 - loss: 3.8978 - regression_loss: 2.7900 - classification_loss: 1.1078 163/500 [========>.....................] - ETA: 2:06 - loss: 3.8947 - regression_loss: 2.7878 - classification_loss: 1.1069 164/500 [========>.....................] - ETA: 2:06 - loss: 3.8917 - regression_loss: 2.7857 - classification_loss: 1.1060 165/500 [========>.....................] - ETA: 2:05 - loss: 3.8892 - regression_loss: 2.7838 - classification_loss: 1.1053 166/500 [========>.....................] - ETA: 2:05 - loss: 3.8870 - regression_loss: 2.7823 - classification_loss: 1.1047 167/500 [=========>....................] - ETA: 2:04 - loss: 3.8846 - regression_loss: 2.7809 - classification_loss: 1.1037 168/500 [=========>....................] - ETA: 2:04 - loss: 3.8817 - regression_loss: 2.7790 - classification_loss: 1.1027 169/500 [=========>....................] - ETA: 2:03 - loss: 3.8794 - regression_loss: 2.7773 - classification_loss: 1.1021 170/500 [=========>....................] - ETA: 2:03 - loss: 3.8787 - regression_loss: 2.7772 - classification_loss: 1.1015 171/500 [=========>....................] - ETA: 2:02 - loss: 3.8754 - regression_loss: 2.7749 - classification_loss: 1.1005 172/500 [=========>....................] 
- ETA: 2:02 - loss: 3.8727 - regression_loss: 2.7733 - classification_loss: 1.0994 173/500 [=========>....................] - ETA: 2:01 - loss: 3.8696 - regression_loss: 2.7715 - classification_loss: 1.0980 174/500 [=========>....................] - ETA: 2:01 - loss: 3.8674 - regression_loss: 2.7702 - classification_loss: 1.0972 175/500 [=========>....................] - ETA: 2:01 - loss: 3.8665 - regression_loss: 2.7700 - classification_loss: 1.0965 176/500 [=========>....................] - ETA: 2:00 - loss: 3.8631 - regression_loss: 2.7679 - classification_loss: 1.0951 177/500 [=========>....................] - ETA: 2:00 - loss: 3.8595 - regression_loss: 2.7658 - classification_loss: 1.0937 178/500 [=========>....................] - ETA: 1:59 - loss: 3.8567 - regression_loss: 2.7642 - classification_loss: 1.0926 179/500 [=========>....................] - ETA: 1:59 - loss: 3.8550 - regression_loss: 2.7629 - classification_loss: 1.0921 180/500 [=========>....................] - ETA: 1:58 - loss: 3.8515 - regression_loss: 2.7608 - classification_loss: 1.0907 181/500 [=========>....................] - ETA: 1:58 - loss: 3.8488 - regression_loss: 2.7595 - classification_loss: 1.0894 182/500 [=========>....................] - ETA: 1:57 - loss: 3.8478 - regression_loss: 2.7590 - classification_loss: 1.0889 183/500 [=========>....................] - ETA: 1:57 - loss: 3.8444 - regression_loss: 2.7569 - classification_loss: 1.0875 184/500 [==========>...................] - ETA: 1:56 - loss: 3.8422 - regression_loss: 2.7556 - classification_loss: 1.0867 185/500 [==========>...................] - ETA: 1:56 - loss: 3.8439 - regression_loss: 2.7556 - classification_loss: 1.0883 186/500 [==========>...................] - ETA: 1:55 - loss: 3.8405 - regression_loss: 2.7538 - classification_loss: 1.0866 187/500 [==========>...................] - ETA: 1:55 - loss: 3.8369 - regression_loss: 2.7520 - classification_loss: 1.0849 188/500 [==========>...................] 
- ETA: 1:55 - loss: 3.8360 - regression_loss: 2.7511 - classification_loss: 1.0849 189/500 [==========>...................] - ETA: 1:54 - loss: 3.8353 - regression_loss: 2.7501 - classification_loss: 1.0852 190/500 [==========>...................] - ETA: 1:54 - loss: 3.8329 - regression_loss: 2.7483 - classification_loss: 1.0846 191/500 [==========>...................] - ETA: 1:53 - loss: 3.8300 - regression_loss: 2.7470 - classification_loss: 1.0831 192/500 [==========>...................] - ETA: 1:53 - loss: 3.8277 - regression_loss: 2.7458 - classification_loss: 1.0820 193/500 [==========>...................] - ETA: 1:52 - loss: 3.8248 - regression_loss: 2.7443 - classification_loss: 1.0805 194/500 [==========>...................] - ETA: 1:52 - loss: 3.8227 - regression_loss: 2.7431 - classification_loss: 1.0796 195/500 [==========>...................] - ETA: 1:52 - loss: 3.8211 - regression_loss: 2.7423 - classification_loss: 1.0788 196/500 [==========>...................] - ETA: 1:51 - loss: 3.8189 - regression_loss: 2.7412 - classification_loss: 1.0777 197/500 [==========>...................] - ETA: 1:51 - loss: 3.8167 - regression_loss: 2.7403 - classification_loss: 1.0764 198/500 [==========>...................] - ETA: 1:50 - loss: 3.8144 - regression_loss: 2.7391 - classification_loss: 1.0753 199/500 [==========>...................] - ETA: 1:50 - loss: 3.8143 - regression_loss: 2.7394 - classification_loss: 1.0749 200/500 [===========>..................] - ETA: 1:49 - loss: 3.8115 - regression_loss: 2.7380 - classification_loss: 1.0735 201/500 [===========>..................] - ETA: 1:49 - loss: 3.8085 - regression_loss: 2.7363 - classification_loss: 1.0722 202/500 [===========>..................] - ETA: 1:49 - loss: 3.8061 - regression_loss: 2.7353 - classification_loss: 1.0708 203/500 [===========>..................] - ETA: 1:48 - loss: 3.8035 - regression_loss: 2.7343 - classification_loss: 1.0692 204/500 [===========>..................] 
- ETA: 1:48 - loss: 3.8005 - regression_loss: 2.7330 - classification_loss: 1.0675
205/500 [===========>..................] - ETA: 1:47 - loss: 3.7995 - regression_loss: 2.7324 - classification_loss: 1.0671
...
250/500 [==============>...............] - ETA: 1:29 - loss: 3.7153 - regression_loss: 2.6960 - classification_loss: 1.0194
...
300/500 [=================>............] - ETA: 1:09 - loss: 3.6438 - regression_loss: 2.6656 - classification_loss: 0.9782
...
350/500 [====================>.........] - ETA: 51s - loss: 3.5943 - regression_loss: 2.6491 - classification_loss: 0.9453
...
400/500 [=======================>......] - ETA: 34s - loss: 3.5519 - regression_loss: 2.6342 - classification_loss: 0.9177
...
450/500 [==========================>...] - ETA: 16s - loss: 3.5063 - regression_loss: 2.6175 - classification_loss: 0.8888
...
500/500 [==============================] - 167s 335ms/step - loss: 3.4777 - regression_loss: 2.6082 - classification_loss: 0.8694
1172 instances of class plum with average precision: 0.0033
mAP: 0.0033
Epoch 00001: saving model to ./training/snapshots/resnet101_pascal_01.h5
Epoch 2/150
1/500 [..............................] - ETA: 2:43 - loss: 2.9905 - regression_loss: 2.4874 - classification_loss: 0.5030
...
39/500 [=>............................]
- ETA: 2:22 - loss: 3.0743 - regression_loss: 2.4682 - classification_loss: 0.6061 40/500 [=>............................] - ETA: 2:22 - loss: 3.0824 - regression_loss: 2.4604 - classification_loss: 0.6221 41/500 [=>............................] - ETA: 2:21 - loss: 3.0844 - regression_loss: 2.4615 - classification_loss: 0.6229 42/500 [=>............................] - ETA: 2:21 - loss: 3.0921 - regression_loss: 2.4609 - classification_loss: 0.6312 43/500 [=>............................] - ETA: 2:21 - loss: 3.1040 - regression_loss: 2.4634 - classification_loss: 0.6406 44/500 [=>............................] - ETA: 2:21 - loss: 3.1051 - regression_loss: 2.4634 - classification_loss: 0.6417 45/500 [=>............................] - ETA: 2:20 - loss: 3.0978 - regression_loss: 2.4603 - classification_loss: 0.6375 46/500 [=>............................] - ETA: 2:20 - loss: 3.0995 - regression_loss: 2.4621 - classification_loss: 0.6374 47/500 [=>............................] - ETA: 2:20 - loss: 3.1025 - regression_loss: 2.4615 - classification_loss: 0.6410 48/500 [=>............................] - ETA: 2:20 - loss: 3.1030 - regression_loss: 2.4602 - classification_loss: 0.6428 49/500 [=>............................] - ETA: 2:19 - loss: 3.1031 - regression_loss: 2.4602 - classification_loss: 0.6429 50/500 [==>...........................] - ETA: 2:19 - loss: 3.1020 - regression_loss: 2.4593 - classification_loss: 0.6427 51/500 [==>...........................] - ETA: 2:19 - loss: 3.1175 - regression_loss: 2.4718 - classification_loss: 0.6458 52/500 [==>...........................] - ETA: 2:18 - loss: 3.1342 - regression_loss: 2.4850 - classification_loss: 0.6492 53/500 [==>...........................] - ETA: 2:18 - loss: 3.1386 - regression_loss: 2.4894 - classification_loss: 0.6492 54/500 [==>...........................] - ETA: 2:18 - loss: 3.1383 - regression_loss: 2.4889 - classification_loss: 0.6495 55/500 [==>...........................] 
- ETA: 2:17 - loss: 3.1358 - regression_loss: 2.4868 - classification_loss: 0.6490 56/500 [==>...........................] - ETA: 2:17 - loss: 3.1334 - regression_loss: 2.4873 - classification_loss: 0.6462 57/500 [==>...........................] - ETA: 2:17 - loss: 3.1321 - regression_loss: 2.4859 - classification_loss: 0.6462 58/500 [==>...........................] - ETA: 2:17 - loss: 3.1395 - regression_loss: 2.4929 - classification_loss: 0.6467 59/500 [==>...........................] - ETA: 2:16 - loss: 3.1437 - regression_loss: 2.4940 - classification_loss: 0.6497 60/500 [==>...........................] - ETA: 2:16 - loss: 3.1422 - regression_loss: 2.4939 - classification_loss: 0.6483 61/500 [==>...........................] - ETA: 2:16 - loss: 3.1404 - regression_loss: 2.4928 - classification_loss: 0.6476 62/500 [==>...........................] - ETA: 2:15 - loss: 3.1378 - regression_loss: 2.4933 - classification_loss: 0.6445 63/500 [==>...........................] - ETA: 2:15 - loss: 3.1368 - regression_loss: 2.4944 - classification_loss: 0.6424 64/500 [==>...........................] - ETA: 2:15 - loss: 3.1353 - regression_loss: 2.4932 - classification_loss: 0.6421 65/500 [==>...........................] - ETA: 2:15 - loss: 3.1328 - regression_loss: 2.4918 - classification_loss: 0.6410 66/500 [==>...........................] - ETA: 2:14 - loss: 3.1304 - regression_loss: 2.4908 - classification_loss: 0.6396 67/500 [===>..........................] - ETA: 2:14 - loss: 3.1278 - regression_loss: 2.4899 - classification_loss: 0.6379 68/500 [===>..........................] - ETA: 2:14 - loss: 3.1267 - regression_loss: 2.4896 - classification_loss: 0.6371 69/500 [===>..........................] - ETA: 2:13 - loss: 3.1230 - regression_loss: 2.4887 - classification_loss: 0.6343 70/500 [===>..........................] - ETA: 2:13 - loss: 3.1222 - regression_loss: 2.4881 - classification_loss: 0.6341 71/500 [===>..........................] 
- ETA: 2:13 - loss: 3.1205 - regression_loss: 2.4870 - classification_loss: 0.6336 72/500 [===>..........................] - ETA: 2:12 - loss: 3.1207 - regression_loss: 2.4866 - classification_loss: 0.6341 73/500 [===>..........................] - ETA: 2:12 - loss: 3.1220 - regression_loss: 2.4876 - classification_loss: 0.6343 74/500 [===>..........................] - ETA: 2:12 - loss: 3.1258 - regression_loss: 2.4901 - classification_loss: 0.6358 75/500 [===>..........................] - ETA: 2:11 - loss: 3.1243 - regression_loss: 2.4894 - classification_loss: 0.6350 76/500 [===>..........................] - ETA: 2:11 - loss: 3.1253 - regression_loss: 2.4883 - classification_loss: 0.6370 77/500 [===>..........................] - ETA: 2:11 - loss: 3.1273 - regression_loss: 2.4903 - classification_loss: 0.6371 78/500 [===>..........................] - ETA: 2:10 - loss: 3.1256 - regression_loss: 2.4905 - classification_loss: 0.6351 79/500 [===>..........................] - ETA: 2:10 - loss: 3.1240 - regression_loss: 2.4902 - classification_loss: 0.6338 80/500 [===>..........................] - ETA: 2:10 - loss: 3.1231 - regression_loss: 2.4891 - classification_loss: 0.6341 81/500 [===>..........................] - ETA: 2:09 - loss: 3.1251 - regression_loss: 2.4902 - classification_loss: 0.6349 82/500 [===>..........................] - ETA: 2:09 - loss: 3.1295 - regression_loss: 2.4921 - classification_loss: 0.6374 83/500 [===>..........................] - ETA: 2:09 - loss: 3.1287 - regression_loss: 2.4912 - classification_loss: 0.6375 84/500 [====>.........................] - ETA: 2:08 - loss: 3.1278 - regression_loss: 2.4902 - classification_loss: 0.6376 85/500 [====>.........................] - ETA: 2:08 - loss: 3.1274 - regression_loss: 2.4900 - classification_loss: 0.6374 86/500 [====>.........................] - ETA: 2:08 - loss: 3.1267 - regression_loss: 2.4896 - classification_loss: 0.6371 87/500 [====>.........................] 
- ETA: 2:08 - loss: 3.1335 - regression_loss: 2.4913 - classification_loss: 0.6422 88/500 [====>.........................] - ETA: 2:07 - loss: 3.1315 - regression_loss: 2.4903 - classification_loss: 0.6412 89/500 [====>.........................] - ETA: 2:07 - loss: 3.1310 - regression_loss: 2.4905 - classification_loss: 0.6405 90/500 [====>.........................] - ETA: 2:07 - loss: 3.1310 - regression_loss: 2.4913 - classification_loss: 0.6397 91/500 [====>.........................] - ETA: 2:06 - loss: 3.1330 - regression_loss: 2.4932 - classification_loss: 0.6399 92/500 [====>.........................] - ETA: 2:06 - loss: 3.1366 - regression_loss: 2.4975 - classification_loss: 0.6391 93/500 [====>.........................] - ETA: 2:06 - loss: 3.1354 - regression_loss: 2.4962 - classification_loss: 0.6392 94/500 [====>.........................] - ETA: 2:06 - loss: 3.1359 - regression_loss: 2.4962 - classification_loss: 0.6397 95/500 [====>.........................] - ETA: 2:05 - loss: 3.1398 - regression_loss: 2.4999 - classification_loss: 0.6399 96/500 [====>.........................] - ETA: 2:05 - loss: 3.1390 - regression_loss: 2.4994 - classification_loss: 0.6396 97/500 [====>.........................] - ETA: 2:05 - loss: 3.1388 - regression_loss: 2.4986 - classification_loss: 0.6402 98/500 [====>.........................] - ETA: 2:04 - loss: 3.1357 - regression_loss: 2.4965 - classification_loss: 0.6392 99/500 [====>.........................] - ETA: 2:04 - loss: 3.1345 - regression_loss: 2.4960 - classification_loss: 0.6385 100/500 [=====>........................] - ETA: 2:04 - loss: 3.1357 - regression_loss: 2.4962 - classification_loss: 0.6395 101/500 [=====>........................] - ETA: 2:03 - loss: 3.1333 - regression_loss: 2.4944 - classification_loss: 0.6388 102/500 [=====>........................] - ETA: 2:03 - loss: 3.1327 - regression_loss: 2.4943 - classification_loss: 0.6384 103/500 [=====>........................] 
- ETA: 2:03 - loss: 3.1332 - regression_loss: 2.4942 - classification_loss: 0.6390 104/500 [=====>........................] - ETA: 2:02 - loss: 3.1333 - regression_loss: 2.4939 - classification_loss: 0.6394 105/500 [=====>........................] - ETA: 2:02 - loss: 3.1327 - regression_loss: 2.4933 - classification_loss: 0.6394 106/500 [=====>........................] - ETA: 2:02 - loss: 3.1345 - regression_loss: 2.4943 - classification_loss: 0.6402 107/500 [=====>........................] - ETA: 2:01 - loss: 3.1339 - regression_loss: 2.4949 - classification_loss: 0.6389 108/500 [=====>........................] - ETA: 2:01 - loss: 3.1353 - regression_loss: 2.4958 - classification_loss: 0.6395 109/500 [=====>........................] - ETA: 2:01 - loss: 3.1339 - regression_loss: 2.4950 - classification_loss: 0.6388 110/500 [=====>........................] - ETA: 2:01 - loss: 3.1340 - regression_loss: 2.4945 - classification_loss: 0.6395 111/500 [=====>........................] - ETA: 2:00 - loss: 3.1324 - regression_loss: 2.4934 - classification_loss: 0.6390 112/500 [=====>........................] - ETA: 2:00 - loss: 3.1308 - regression_loss: 2.4925 - classification_loss: 0.6384 113/500 [=====>........................] - ETA: 2:00 - loss: 3.1296 - regression_loss: 2.4919 - classification_loss: 0.6377 114/500 [=====>........................] - ETA: 1:59 - loss: 3.1292 - regression_loss: 2.4906 - classification_loss: 0.6386 115/500 [=====>........................] - ETA: 1:59 - loss: 3.1277 - regression_loss: 2.4904 - classification_loss: 0.6373 116/500 [=====>........................] - ETA: 1:59 - loss: 3.1270 - regression_loss: 2.4899 - classification_loss: 0.6371 117/500 [======>.......................] - ETA: 1:58 - loss: 3.1263 - regression_loss: 2.4893 - classification_loss: 0.6370 118/500 [======>.......................] - ETA: 1:58 - loss: 3.1258 - regression_loss: 2.4896 - classification_loss: 0.6363 119/500 [======>.......................] 
- ETA: 1:58 - loss: 3.1249 - regression_loss: 2.4894 - classification_loss: 0.6355 120/500 [======>.......................] - ETA: 1:58 - loss: 3.1244 - regression_loss: 2.4898 - classification_loss: 0.6346 121/500 [======>.......................] - ETA: 1:57 - loss: 3.1237 - regression_loss: 2.4892 - classification_loss: 0.6346 122/500 [======>.......................] - ETA: 1:57 - loss: 3.1223 - regression_loss: 2.4882 - classification_loss: 0.6341 123/500 [======>.......................] - ETA: 1:57 - loss: 3.1210 - regression_loss: 2.4879 - classification_loss: 0.6330 124/500 [======>.......................] - ETA: 1:56 - loss: 3.1255 - regression_loss: 2.4909 - classification_loss: 0.6346 125/500 [======>.......................] - ETA: 1:56 - loss: 3.1239 - regression_loss: 2.4899 - classification_loss: 0.6340 126/500 [======>.......................] - ETA: 1:56 - loss: 3.1230 - regression_loss: 2.4896 - classification_loss: 0.6334 127/500 [======>.......................] - ETA: 1:55 - loss: 3.1222 - regression_loss: 2.4895 - classification_loss: 0.6328 128/500 [======>.......................] - ETA: 1:55 - loss: 3.1224 - regression_loss: 2.4899 - classification_loss: 0.6325 129/500 [======>.......................] - ETA: 1:55 - loss: 3.1214 - regression_loss: 2.4891 - classification_loss: 0.6324 130/500 [======>.......................] - ETA: 1:54 - loss: 3.1207 - regression_loss: 2.4886 - classification_loss: 0.6321 131/500 [======>.......................] - ETA: 1:54 - loss: 3.1210 - regression_loss: 2.4898 - classification_loss: 0.6312 132/500 [======>.......................] - ETA: 1:54 - loss: 3.1200 - regression_loss: 2.4891 - classification_loss: 0.6309 133/500 [======>.......................] - ETA: 1:53 - loss: 3.1204 - regression_loss: 2.4888 - classification_loss: 0.6316 134/500 [=======>......................] - ETA: 1:53 - loss: 3.1206 - regression_loss: 2.4891 - classification_loss: 0.6316 135/500 [=======>......................] 
- ETA: 1:53 - loss: 3.1206 - regression_loss: 2.4881 - classification_loss: 0.6326 136/500 [=======>......................] - ETA: 1:53 - loss: 3.1209 - regression_loss: 2.4883 - classification_loss: 0.6325 137/500 [=======>......................] - ETA: 1:52 - loss: 3.1185 - regression_loss: 2.4872 - classification_loss: 0.6313 138/500 [=======>......................] - ETA: 1:52 - loss: 3.1181 - regression_loss: 2.4872 - classification_loss: 0.6310 139/500 [=======>......................] - ETA: 1:52 - loss: 3.1176 - regression_loss: 2.4867 - classification_loss: 0.6309 140/500 [=======>......................] - ETA: 1:51 - loss: 3.1164 - regression_loss: 2.4861 - classification_loss: 0.6302 141/500 [=======>......................] - ETA: 1:51 - loss: 3.1156 - regression_loss: 2.4860 - classification_loss: 0.6295 142/500 [=======>......................] - ETA: 1:51 - loss: 3.1149 - regression_loss: 2.4856 - classification_loss: 0.6293 143/500 [=======>......................] - ETA: 1:50 - loss: 3.1147 - regression_loss: 2.4856 - classification_loss: 0.6291 144/500 [=======>......................] - ETA: 1:50 - loss: 3.1137 - regression_loss: 2.4851 - classification_loss: 0.6286 145/500 [=======>......................] - ETA: 1:50 - loss: 3.1127 - regression_loss: 2.4847 - classification_loss: 0.6281 146/500 [=======>......................] - ETA: 1:50 - loss: 3.1138 - regression_loss: 2.4864 - classification_loss: 0.6274 147/500 [=======>......................] - ETA: 1:49 - loss: 3.1149 - regression_loss: 2.4845 - classification_loss: 0.6304 148/500 [=======>......................] - ETA: 1:49 - loss: 3.1139 - regression_loss: 2.4838 - classification_loss: 0.6301 149/500 [=======>......................] - ETA: 1:49 - loss: 3.1130 - regression_loss: 2.4831 - classification_loss: 0.6298 150/500 [========>.....................] - ETA: 1:48 - loss: 3.1134 - regression_loss: 2.4840 - classification_loss: 0.6294 151/500 [========>.....................] 
- ETA: 1:48 - loss: 3.1125 - regression_loss: 2.4833 - classification_loss: 0.6292 152/500 [========>.....................] - ETA: 1:48 - loss: 3.1117 - regression_loss: 2.4828 - classification_loss: 0.6289 153/500 [========>.....................] - ETA: 1:47 - loss: 3.1121 - regression_loss: 2.4834 - classification_loss: 0.6287 154/500 [========>.....................] - ETA: 1:47 - loss: 3.1107 - regression_loss: 2.4825 - classification_loss: 0.6282 155/500 [========>.....................] - ETA: 1:47 - loss: 3.1106 - regression_loss: 2.4823 - classification_loss: 0.6283 156/500 [========>.....................] - ETA: 1:46 - loss: 3.1109 - regression_loss: 2.4822 - classification_loss: 0.6287 157/500 [========>.....................] - ETA: 1:46 - loss: 3.1094 - regression_loss: 2.4804 - classification_loss: 0.6290 158/500 [========>.....................] - ETA: 1:46 - loss: 3.1092 - regression_loss: 2.4796 - classification_loss: 0.6296 159/500 [========>.....................] - ETA: 1:45 - loss: 3.1078 - regression_loss: 2.4788 - classification_loss: 0.6290 160/500 [========>.....................] - ETA: 1:45 - loss: 3.1075 - regression_loss: 2.4793 - classification_loss: 0.6282 161/500 [========>.....................] - ETA: 1:45 - loss: 3.1088 - regression_loss: 2.4810 - classification_loss: 0.6278 162/500 [========>.....................] - ETA: 1:45 - loss: 3.1078 - regression_loss: 2.4806 - classification_loss: 0.6272 163/500 [========>.....................] - ETA: 1:44 - loss: 3.1067 - regression_loss: 2.4806 - classification_loss: 0.6262 164/500 [========>.....................] - ETA: 1:44 - loss: 3.1061 - regression_loss: 2.4805 - classification_loss: 0.6256 165/500 [========>.....................] - ETA: 1:44 - loss: 3.1066 - regression_loss: 2.4808 - classification_loss: 0.6259 166/500 [========>.....................] - ETA: 1:43 - loss: 3.1071 - regression_loss: 2.4813 - classification_loss: 0.6258 167/500 [=========>....................] 
- ETA: 1:43 - loss: 3.1066 - regression_loss: 2.4815 - classification_loss: 0.6252 168/500 [=========>....................] - ETA: 1:43 - loss: 3.1064 - regression_loss: 2.4814 - classification_loss: 0.6250 169/500 [=========>....................] - ETA: 1:42 - loss: 3.1058 - regression_loss: 2.4810 - classification_loss: 0.6248 170/500 [=========>....................] - ETA: 1:42 - loss: 3.1053 - regression_loss: 2.4806 - classification_loss: 0.6247 171/500 [=========>....................] - ETA: 1:42 - loss: 3.1056 - regression_loss: 2.4805 - classification_loss: 0.6251 172/500 [=========>....................] - ETA: 1:41 - loss: 3.1062 - regression_loss: 2.4807 - classification_loss: 0.6255 173/500 [=========>....................] - ETA: 1:41 - loss: 3.1054 - regression_loss: 2.4802 - classification_loss: 0.6252 174/500 [=========>....................] - ETA: 1:41 - loss: 3.1056 - regression_loss: 2.4812 - classification_loss: 0.6244 175/500 [=========>....................] - ETA: 1:40 - loss: 3.1049 - regression_loss: 2.4808 - classification_loss: 0.6241 176/500 [=========>....................] - ETA: 1:40 - loss: 3.1062 - regression_loss: 2.4826 - classification_loss: 0.6236 177/500 [=========>....................] - ETA: 1:40 - loss: 3.1057 - regression_loss: 2.4825 - classification_loss: 0.6233 178/500 [=========>....................] - ETA: 1:40 - loss: 3.1055 - regression_loss: 2.4821 - classification_loss: 0.6234 179/500 [=========>....................] - ETA: 1:39 - loss: 3.1067 - regression_loss: 2.4830 - classification_loss: 0.6237 180/500 [=========>....................] - ETA: 1:39 - loss: 3.1073 - regression_loss: 2.4837 - classification_loss: 0.6236 181/500 [=========>....................] - ETA: 1:39 - loss: 3.1057 - regression_loss: 2.4825 - classification_loss: 0.6232 182/500 [=========>....................] - ETA: 1:38 - loss: 3.1085 - regression_loss: 2.4851 - classification_loss: 0.6235 183/500 [=========>....................] 
- ETA: 1:38 - loss: 3.1075 - regression_loss: 2.4844 - classification_loss: 0.6232 184/500 [==========>...................] - ETA: 1:38 - loss: 3.1063 - regression_loss: 2.4840 - classification_loss: 0.6223 185/500 [==========>...................] - ETA: 1:37 - loss: 3.1056 - regression_loss: 2.4839 - classification_loss: 0.6217 186/500 [==========>...................] - ETA: 1:37 - loss: 3.1056 - regression_loss: 2.4834 - classification_loss: 0.6222 187/500 [==========>...................] - ETA: 1:37 - loss: 3.1056 - regression_loss: 2.4834 - classification_loss: 0.6222 188/500 [==========>...................] - ETA: 1:36 - loss: 3.1053 - regression_loss: 2.4831 - classification_loss: 0.6222 189/500 [==========>...................] - ETA: 1:36 - loss: 3.1067 - regression_loss: 2.4841 - classification_loss: 0.6225 190/500 [==========>...................] - ETA: 1:36 - loss: 3.1071 - regression_loss: 2.4847 - classification_loss: 0.6224 191/500 [==========>...................] - ETA: 1:36 - loss: 3.1071 - regression_loss: 2.4847 - classification_loss: 0.6224 192/500 [==========>...................] - ETA: 1:35 - loss: 3.1093 - regression_loss: 2.4861 - classification_loss: 0.6232 193/500 [==========>...................] - ETA: 1:35 - loss: 3.1069 - regression_loss: 2.4848 - classification_loss: 0.6221 194/500 [==========>...................] - ETA: 1:35 - loss: 3.1065 - regression_loss: 2.4844 - classification_loss: 0.6221 195/500 [==========>...................] - ETA: 1:34 - loss: 3.1080 - regression_loss: 2.4859 - classification_loss: 0.6220 196/500 [==========>...................] - ETA: 1:34 - loss: 3.1078 - regression_loss: 2.4854 - classification_loss: 0.6224 197/500 [==========>...................] - ETA: 1:34 - loss: 3.1072 - regression_loss: 2.4851 - classification_loss: 0.6221 198/500 [==========>...................] - ETA: 1:33 - loss: 3.1068 - regression_loss: 2.4849 - classification_loss: 0.6219 199/500 [==========>...................] 
- ETA: 1:33 - loss: 3.1053 - regression_loss: 2.4837 - classification_loss: 0.6216 200/500 [===========>..................] - ETA: 1:33 - loss: 3.1048 - regression_loss: 2.4835 - classification_loss: 0.6213 201/500 [===========>..................] - ETA: 1:32 - loss: 3.1042 - regression_loss: 2.4831 - classification_loss: 0.6211 202/500 [===========>..................] - ETA: 1:32 - loss: 3.1034 - regression_loss: 2.4828 - classification_loss: 0.6206 203/500 [===========>..................] - ETA: 1:32 - loss: 3.1030 - regression_loss: 2.4825 - classification_loss: 0.6205 204/500 [===========>..................] - ETA: 1:32 - loss: 3.1032 - regression_loss: 2.4827 - classification_loss: 0.6204 205/500 [===========>..................] - ETA: 1:31 - loss: 3.1026 - regression_loss: 2.4827 - classification_loss: 0.6199 206/500 [===========>..................] - ETA: 1:31 - loss: 3.1023 - regression_loss: 2.4827 - classification_loss: 0.6196 207/500 [===========>..................] - ETA: 1:31 - loss: 3.1011 - regression_loss: 2.4821 - classification_loss: 0.6190 208/500 [===========>..................] - ETA: 1:30 - loss: 3.1013 - regression_loss: 2.4823 - classification_loss: 0.6190 209/500 [===========>..................] - ETA: 1:30 - loss: 3.1003 - regression_loss: 2.4817 - classification_loss: 0.6186 210/500 [===========>..................] - ETA: 1:30 - loss: 3.0999 - regression_loss: 2.4817 - classification_loss: 0.6182 211/500 [===========>..................] - ETA: 1:29 - loss: 3.1013 - regression_loss: 2.4815 - classification_loss: 0.6198 212/500 [===========>..................] - ETA: 1:29 - loss: 3.1018 - regression_loss: 2.4816 - classification_loss: 0.6203 213/500 [===========>..................] - ETA: 1:29 - loss: 3.1017 - regression_loss: 2.4818 - classification_loss: 0.6199 214/500 [===========>..................] - ETA: 1:28 - loss: 3.1011 - regression_loss: 2.4812 - classification_loss: 0.6199 215/500 [===========>..................] 
- ETA: 1:28 - loss: 3.0999 - regression_loss: 2.4809 - classification_loss: 0.6190 216/500 [===========>..................] - ETA: 1:28 - loss: 3.0992 - regression_loss: 2.4806 - classification_loss: 0.6186 217/500 [============>.................] - ETA: 1:27 - loss: 3.0989 - regression_loss: 2.4801 - classification_loss: 0.6188 218/500 [============>.................] - ETA: 1:27 - loss: 3.0983 - regression_loss: 2.4801 - classification_loss: 0.6182 219/500 [============>.................] - ETA: 1:27 - loss: 3.0969 - regression_loss: 2.4790 - classification_loss: 0.6178 220/500 [============>.................] - ETA: 1:27 - loss: 3.0978 - regression_loss: 2.4795 - classification_loss: 0.6183 221/500 [============>.................] - ETA: 1:26 - loss: 3.0971 - regression_loss: 2.4792 - classification_loss: 0.6179 222/500 [============>.................] - ETA: 1:26 - loss: 3.0972 - regression_loss: 2.4794 - classification_loss: 0.6178 223/500 [============>.................] - ETA: 1:26 - loss: 3.0972 - regression_loss: 2.4793 - classification_loss: 0.6179 224/500 [============>.................] - ETA: 1:25 - loss: 3.0966 - regression_loss: 2.4789 - classification_loss: 0.6177 225/500 [============>.................] - ETA: 1:25 - loss: 3.0960 - regression_loss: 2.4786 - classification_loss: 0.6174 226/500 [============>.................] - ETA: 1:25 - loss: 3.0961 - regression_loss: 2.4782 - classification_loss: 0.6180 227/500 [============>.................] - ETA: 1:24 - loss: 3.0957 - regression_loss: 2.4779 - classification_loss: 0.6179 228/500 [============>.................] - ETA: 1:24 - loss: 3.0956 - regression_loss: 2.4776 - classification_loss: 0.6180 229/500 [============>.................] - ETA: 1:24 - loss: 3.0950 - regression_loss: 2.4773 - classification_loss: 0.6177 230/500 [============>.................] - ETA: 1:23 - loss: 3.0947 - regression_loss: 2.4763 - classification_loss: 0.6184 231/500 [============>.................] 
- ETA: 1:23 - loss: 3.0941 - regression_loss: 2.4760 - classification_loss: 0.6181 232/500 [============>.................] - ETA: 1:23 - loss: 3.0934 - regression_loss: 2.4759 - classification_loss: 0.6175 233/500 [============>.................] - ETA: 1:23 - loss: 3.0937 - regression_loss: 2.4762 - classification_loss: 0.6175 234/500 [=============>................] - ETA: 1:22 - loss: 3.0946 - regression_loss: 2.4768 - classification_loss: 0.6179 235/500 [=============>................] - ETA: 1:22 - loss: 3.0937 - regression_loss: 2.4766 - classification_loss: 0.6171 236/500 [=============>................] - ETA: 1:22 - loss: 3.0940 - regression_loss: 2.4769 - classification_loss: 0.6171 237/500 [=============>................] - ETA: 1:21 - loss: 3.0941 - regression_loss: 2.4776 - classification_loss: 0.6165 238/500 [=============>................] - ETA: 1:21 - loss: 3.0942 - regression_loss: 2.4776 - classification_loss: 0.6166 239/500 [=============>................] - ETA: 1:21 - loss: 3.0936 - regression_loss: 2.4771 - classification_loss: 0.6166 240/500 [=============>................] - ETA: 1:20 - loss: 3.0938 - regression_loss: 2.4769 - classification_loss: 0.6168 241/500 [=============>................] - ETA: 1:20 - loss: 3.0934 - regression_loss: 2.4765 - classification_loss: 0.6169 242/500 [=============>................] - ETA: 1:20 - loss: 3.0950 - regression_loss: 2.4778 - classification_loss: 0.6172 243/500 [=============>................] - ETA: 1:19 - loss: 3.0946 - regression_loss: 2.4779 - classification_loss: 0.6168 244/500 [=============>................] - ETA: 1:19 - loss: 3.0966 - regression_loss: 2.4787 - classification_loss: 0.6179 245/500 [=============>................] - ETA: 1:19 - loss: 3.0977 - regression_loss: 2.4792 - classification_loss: 0.6185 246/500 [=============>................] - ETA: 1:19 - loss: 3.0977 - regression_loss: 2.4794 - classification_loss: 0.6182 247/500 [=============>................] 
248/500 [=============>................] - ETA: 1:18 - loss: 3.0977 - regression_loss: 2.4792 - classification_loss: 0.6185
[per-step progress updates for steps 249-499 of epoch 2 omitted; running loss declined steadily from 3.098 to 3.052]
500/500 [==============================] - 159s 319ms/step - loss: 3.0521 - regression_loss: 2.4599 - classification_loss: 0.5922
1172 instances of class plum with average precision: 0.0270
mAP: 0.0270
Epoch 00002: saving model to ./training/snapshots/resnet101_pascal_02.h5
Epoch 3/150
1/500 [..............................] - ETA: 2:35 - loss: 4.2835 - regression_loss: 2.7760 - classification_loss: 1.5075
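The `loss` column in each progress line is just the sum of the two reported terms: a smooth-L1 box-regression loss and a focal classification loss, which are RetinaNet's standard training objectives. A minimal sketch of that decomposition follows, assuming keras-retinanet's usual defaults (sigma=3 for smooth L1, alpha=0.25 and gamma=2 for focal loss); the function names and the toy inputs here are illustrative, not taken from the training code.

```python
import numpy as np

def smooth_l1(pred, target, sigma=3.0):
    # Smooth L1 (Huber) loss: quadratic for small residuals,
    # linear for large ones -- the box-regression term.
    sigma2 = sigma ** 2
    diff = np.abs(pred - target)
    return np.where(diff < 1.0 / sigma2,
                    0.5 * sigma2 * diff ** 2,
                    diff - 0.5 / sigma2).mean()

def focal(prob, label, alpha=0.25, gamma=2.0):
    # Focal loss: down-weights easy examples via the (1 - p_t)^gamma
    # factor -- the classification term.
    p_t = np.where(label == 1, prob, 1.0 - prob)
    alpha_t = np.where(label == 1, alpha, 1.0 - alpha)
    return (-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)).mean()

# Toy anchors: the logged `loss` is simply the sum of the two terms.
reg = smooth_l1(np.array([0.1, -0.3]), np.array([0.0, 0.0]))
cls = focal(np.array([0.9, 0.2]), np.array([1, 0]))
total = reg + cls
```

With this in mind, a falling `regression_loss` alongside a flat `classification_loss` (as in the log above) indicates the box heads are learning faster than the class heads, which is common early in training.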
2/500 [..............................] - ETA: 2:37 - loss: 3.9163 - regression_loss: 2.8187 - classification_loss: 1.0975
[per-step progress updates for steps 3-82 of epoch 3 omitted; running loss fell from 3.92 to roughly 3.01]
- ETA: 2:18 - loss: 3.0062 - regression_loss: 2.4000 - classification_loss: 0.6062 83/500 [===>..........................] - ETA: 2:17 - loss: 3.0045 - regression_loss: 2.3995 - classification_loss: 0.6050 84/500 [====>.........................] - ETA: 2:17 - loss: 3.0043 - regression_loss: 2.3995 - classification_loss: 0.6049 85/500 [====>.........................] - ETA: 2:17 - loss: 3.0087 - regression_loss: 2.4020 - classification_loss: 0.6068 86/500 [====>.........................] - ETA: 2:16 - loss: 3.0088 - regression_loss: 2.4022 - classification_loss: 0.6065 87/500 [====>.........................] - ETA: 2:16 - loss: 3.0207 - regression_loss: 2.4055 - classification_loss: 0.6153 88/500 [====>.........................] - ETA: 2:16 - loss: 3.0157 - regression_loss: 2.4026 - classification_loss: 0.6131 89/500 [====>.........................] - ETA: 2:15 - loss: 3.0129 - regression_loss: 2.4021 - classification_loss: 0.6108 90/500 [====>.........................] - ETA: 2:15 - loss: 3.0116 - regression_loss: 2.4014 - classification_loss: 0.6102 91/500 [====>.........................] - ETA: 2:15 - loss: 3.0114 - regression_loss: 2.4013 - classification_loss: 0.6102 92/500 [====>.........................] - ETA: 2:14 - loss: 3.0114 - regression_loss: 2.4014 - classification_loss: 0.6100 93/500 [====>.........................] - ETA: 2:14 - loss: 3.0108 - regression_loss: 2.4019 - classification_loss: 0.6089 94/500 [====>.........................] - ETA: 2:14 - loss: 3.0099 - regression_loss: 2.4013 - classification_loss: 0.6086 95/500 [====>.........................] - ETA: 2:13 - loss: 3.0086 - regression_loss: 2.4000 - classification_loss: 0.6087 96/500 [====>.........................] - ETA: 2:13 - loss: 3.0126 - regression_loss: 2.4035 - classification_loss: 0.6091 97/500 [====>.........................] - ETA: 2:13 - loss: 3.0133 - regression_loss: 2.4061 - classification_loss: 0.6071 98/500 [====>.........................] 
- ETA: 2:12 - loss: 3.0165 - regression_loss: 2.4082 - classification_loss: 0.6083 99/500 [====>.........................] - ETA: 2:12 - loss: 3.0123 - regression_loss: 2.4059 - classification_loss: 0.6064 100/500 [=====>........................] - ETA: 2:12 - loss: 3.0156 - regression_loss: 2.4086 - classification_loss: 0.6070 101/500 [=====>........................] - ETA: 2:11 - loss: 3.0151 - regression_loss: 2.4089 - classification_loss: 0.6062 102/500 [=====>........................] - ETA: 2:11 - loss: 3.0166 - regression_loss: 2.4117 - classification_loss: 0.6048 103/500 [=====>........................] - ETA: 2:11 - loss: 3.0170 - regression_loss: 2.4124 - classification_loss: 0.6046 104/500 [=====>........................] - ETA: 2:10 - loss: 3.0141 - regression_loss: 2.4098 - classification_loss: 0.6043 105/500 [=====>........................] - ETA: 2:10 - loss: 3.0132 - regression_loss: 2.4096 - classification_loss: 0.6036 106/500 [=====>........................] - ETA: 2:09 - loss: 3.0116 - regression_loss: 2.4096 - classification_loss: 0.6020 107/500 [=====>........................] - ETA: 2:09 - loss: 3.0119 - regression_loss: 2.4100 - classification_loss: 0.6019 108/500 [=====>........................] - ETA: 2:09 - loss: 3.0117 - regression_loss: 2.4103 - classification_loss: 0.6014 109/500 [=====>........................] - ETA: 2:09 - loss: 3.0104 - regression_loss: 2.4095 - classification_loss: 0.6009 110/500 [=====>........................] - ETA: 2:08 - loss: 3.0095 - regression_loss: 2.4097 - classification_loss: 0.5999 111/500 [=====>........................] - ETA: 2:08 - loss: 3.0096 - regression_loss: 2.4100 - classification_loss: 0.5996 112/500 [=====>........................] - ETA: 2:08 - loss: 3.0074 - regression_loss: 2.4084 - classification_loss: 0.5990 113/500 [=====>........................] - ETA: 2:07 - loss: 3.0073 - regression_loss: 2.4064 - classification_loss: 0.6009 114/500 [=====>........................] 
- ETA: 2:07 - loss: 3.0053 - regression_loss: 2.4062 - classification_loss: 0.5992 115/500 [=====>........................] - ETA: 2:07 - loss: 3.0047 - regression_loss: 2.4061 - classification_loss: 0.5987 116/500 [=====>........................] - ETA: 2:06 - loss: 3.0032 - regression_loss: 2.4055 - classification_loss: 0.5977 117/500 [======>.......................] - ETA: 2:06 - loss: 3.0021 - regression_loss: 2.4050 - classification_loss: 0.5971 118/500 [======>.......................] - ETA: 2:06 - loss: 3.0010 - regression_loss: 2.4045 - classification_loss: 0.5965 119/500 [======>.......................] - ETA: 2:05 - loss: 2.9941 - regression_loss: 2.4000 - classification_loss: 0.5941 120/500 [======>.......................] - ETA: 2:05 - loss: 2.9926 - regression_loss: 2.4005 - classification_loss: 0.5921 121/500 [======>.......................] - ETA: 2:05 - loss: 2.9924 - regression_loss: 2.4006 - classification_loss: 0.5918 122/500 [======>.......................] - ETA: 2:04 - loss: 2.9905 - regression_loss: 2.4000 - classification_loss: 0.5905 123/500 [======>.......................] - ETA: 2:04 - loss: 2.9913 - regression_loss: 2.4004 - classification_loss: 0.5908 124/500 [======>.......................] - ETA: 2:04 - loss: 2.9913 - regression_loss: 2.4006 - classification_loss: 0.5907 125/500 [======>.......................] - ETA: 2:03 - loss: 2.9900 - regression_loss: 2.3998 - classification_loss: 0.5902 126/500 [======>.......................] - ETA: 2:03 - loss: 2.9900 - regression_loss: 2.3999 - classification_loss: 0.5901 127/500 [======>.......................] - ETA: 2:03 - loss: 2.9878 - regression_loss: 2.3990 - classification_loss: 0.5888 128/500 [======>.......................] - ETA: 2:03 - loss: 2.9878 - regression_loss: 2.3990 - classification_loss: 0.5888 129/500 [======>.......................] - ETA: 2:02 - loss: 2.9866 - regression_loss: 2.3987 - classification_loss: 0.5880 130/500 [======>.......................] 
- ETA: 2:02 - loss: 2.9880 - regression_loss: 2.3983 - classification_loss: 0.5897 131/500 [======>.......................] - ETA: 2:01 - loss: 2.9873 - regression_loss: 2.3980 - classification_loss: 0.5893 132/500 [======>.......................] - ETA: 2:01 - loss: 2.9866 - regression_loss: 2.3978 - classification_loss: 0.5888 133/500 [======>.......................] - ETA: 2:01 - loss: 2.9856 - regression_loss: 2.3975 - classification_loss: 0.5881 134/500 [=======>......................] - ETA: 2:01 - loss: 2.9854 - regression_loss: 2.3977 - classification_loss: 0.5877 135/500 [=======>......................] - ETA: 2:00 - loss: 2.9868 - regression_loss: 2.3984 - classification_loss: 0.5883 136/500 [=======>......................] - ETA: 2:00 - loss: 2.9858 - regression_loss: 2.3981 - classification_loss: 0.5877 137/500 [=======>......................] - ETA: 2:00 - loss: 2.9837 - regression_loss: 2.3964 - classification_loss: 0.5874 138/500 [=======>......................] - ETA: 1:59 - loss: 2.9829 - regression_loss: 2.3961 - classification_loss: 0.5868 139/500 [=======>......................] - ETA: 1:59 - loss: 2.9819 - regression_loss: 2.3957 - classification_loss: 0.5862 140/500 [=======>......................] - ETA: 1:59 - loss: 2.9856 - regression_loss: 2.3989 - classification_loss: 0.5866 141/500 [=======>......................] - ETA: 1:58 - loss: 2.9807 - regression_loss: 2.3955 - classification_loss: 0.5853 142/500 [=======>......................] - ETA: 1:58 - loss: 2.9795 - regression_loss: 2.3942 - classification_loss: 0.5853 143/500 [=======>......................] - ETA: 1:58 - loss: 2.9789 - regression_loss: 2.3937 - classification_loss: 0.5852 144/500 [=======>......................] - ETA: 1:57 - loss: 2.9818 - regression_loss: 2.3950 - classification_loss: 0.5868 145/500 [=======>......................] - ETA: 1:57 - loss: 2.9834 - regression_loss: 2.3969 - classification_loss: 0.5865 146/500 [=======>......................] 
- ETA: 1:57 - loss: 2.9829 - regression_loss: 2.3964 - classification_loss: 0.5865 147/500 [=======>......................] - ETA: 1:56 - loss: 2.9815 - regression_loss: 2.3953 - classification_loss: 0.5862 148/500 [=======>......................] - ETA: 1:56 - loss: 2.9847 - regression_loss: 2.3968 - classification_loss: 0.5879 149/500 [=======>......................] - ETA: 1:56 - loss: 2.9863 - regression_loss: 2.3979 - classification_loss: 0.5884 150/500 [========>.....................] - ETA: 1:55 - loss: 2.9827 - regression_loss: 2.3947 - classification_loss: 0.5880 151/500 [========>.....................] - ETA: 1:55 - loss: 2.9807 - regression_loss: 2.3934 - classification_loss: 0.5873 152/500 [========>.....................] - ETA: 1:55 - loss: 2.9790 - regression_loss: 2.3928 - classification_loss: 0.5862 153/500 [========>.....................] - ETA: 1:54 - loss: 2.9767 - regression_loss: 2.3911 - classification_loss: 0.5857 154/500 [========>.....................] - ETA: 1:54 - loss: 2.9756 - regression_loss: 2.3903 - classification_loss: 0.5852 155/500 [========>.....................] - ETA: 1:54 - loss: 2.9720 - regression_loss: 2.3881 - classification_loss: 0.5838 156/500 [========>.....................] - ETA: 1:53 - loss: 2.9710 - regression_loss: 2.3875 - classification_loss: 0.5835 157/500 [========>.....................] - ETA: 1:53 - loss: 2.9706 - regression_loss: 2.3872 - classification_loss: 0.5834 158/500 [========>.....................] - ETA: 1:53 - loss: 2.9694 - regression_loss: 2.3864 - classification_loss: 0.5830 159/500 [========>.....................] - ETA: 1:52 - loss: 2.9695 - regression_loss: 2.3868 - classification_loss: 0.5827 160/500 [========>.....................] - ETA: 1:52 - loss: 2.9691 - regression_loss: 2.3870 - classification_loss: 0.5821 161/500 [========>.....................] - ETA: 1:52 - loss: 2.9681 - regression_loss: 2.3865 - classification_loss: 0.5815 162/500 [========>.....................] 
- ETA: 1:51 - loss: 2.9693 - regression_loss: 2.3866 - classification_loss: 0.5827 163/500 [========>.....................] - ETA: 1:51 - loss: 2.9689 - regression_loss: 2.3865 - classification_loss: 0.5824 164/500 [========>.....................] - ETA: 1:51 - loss: 2.9689 - regression_loss: 2.3869 - classification_loss: 0.5819 165/500 [========>.....................] - ETA: 1:50 - loss: 2.9678 - regression_loss: 2.3867 - classification_loss: 0.5811 166/500 [========>.....................] - ETA: 1:50 - loss: 2.9675 - regression_loss: 2.3869 - classification_loss: 0.5806 167/500 [=========>....................] - ETA: 1:50 - loss: 2.9649 - regression_loss: 2.3859 - classification_loss: 0.5789 168/500 [=========>....................] - ETA: 1:49 - loss: 2.9645 - regression_loss: 2.3857 - classification_loss: 0.5788 169/500 [=========>....................] - ETA: 1:49 - loss: 2.9640 - regression_loss: 2.3852 - classification_loss: 0.5787 170/500 [=========>....................] - ETA: 1:49 - loss: 2.9643 - regression_loss: 2.3856 - classification_loss: 0.5787 171/500 [=========>....................] - ETA: 1:48 - loss: 2.9638 - regression_loss: 2.3855 - classification_loss: 0.5783 172/500 [=========>....................] - ETA: 1:48 - loss: 2.9638 - regression_loss: 2.3854 - classification_loss: 0.5784 173/500 [=========>....................] - ETA: 1:48 - loss: 2.9654 - regression_loss: 2.3876 - classification_loss: 0.5779 174/500 [=========>....................] - ETA: 1:47 - loss: 2.9646 - regression_loss: 2.3872 - classification_loss: 0.5774 175/500 [=========>....................] - ETA: 1:47 - loss: 2.9634 - regression_loss: 2.3869 - classification_loss: 0.5766 176/500 [=========>....................] - ETA: 1:47 - loss: 2.9637 - regression_loss: 2.3867 - classification_loss: 0.5769 177/500 [=========>....................] - ETA: 1:46 - loss: 2.9631 - regression_loss: 2.3863 - classification_loss: 0.5768 178/500 [=========>....................] 
- ETA: 1:46 - loss: 2.9632 - regression_loss: 2.3867 - classification_loss: 0.5764 179/500 [=========>....................] - ETA: 1:46 - loss: 2.9622 - regression_loss: 2.3865 - classification_loss: 0.5757 180/500 [=========>....................] - ETA: 1:46 - loss: 2.9616 - regression_loss: 2.3863 - classification_loss: 0.5753 181/500 [=========>....................] - ETA: 1:45 - loss: 2.9608 - regression_loss: 2.3858 - classification_loss: 0.5750 182/500 [=========>....................] - ETA: 1:45 - loss: 2.9612 - regression_loss: 2.3863 - classification_loss: 0.5749 183/500 [=========>....................] - ETA: 1:45 - loss: 2.9636 - regression_loss: 2.3874 - classification_loss: 0.5762 184/500 [==========>...................] - ETA: 1:44 - loss: 2.9637 - regression_loss: 2.3880 - classification_loss: 0.5757 185/500 [==========>...................] - ETA: 1:44 - loss: 2.9655 - regression_loss: 2.3889 - classification_loss: 0.5766 186/500 [==========>...................] - ETA: 1:44 - loss: 2.9637 - regression_loss: 2.3881 - classification_loss: 0.5756 187/500 [==========>...................] - ETA: 1:43 - loss: 2.9642 - regression_loss: 2.3889 - classification_loss: 0.5753 188/500 [==========>...................] - ETA: 1:43 - loss: 2.9647 - regression_loss: 2.3897 - classification_loss: 0.5751 189/500 [==========>...................] - ETA: 1:42 - loss: 2.9641 - regression_loss: 2.3891 - classification_loss: 0.5750 190/500 [==========>...................] - ETA: 1:42 - loss: 2.9636 - regression_loss: 2.3892 - classification_loss: 0.5743 191/500 [==========>...................] - ETA: 1:42 - loss: 2.9635 - regression_loss: 2.3894 - classification_loss: 0.5741 192/500 [==========>...................] - ETA: 1:41 - loss: 2.9644 - regression_loss: 2.3903 - classification_loss: 0.5742 193/500 [==========>...................] - ETA: 1:41 - loss: 2.9606 - regression_loss: 2.3878 - classification_loss: 0.5729 194/500 [==========>...................] 
- ETA: 1:41 - loss: 2.9600 - regression_loss: 2.3877 - classification_loss: 0.5723 195/500 [==========>...................] - ETA: 1:40 - loss: 2.9595 - regression_loss: 2.3875 - classification_loss: 0.5720 196/500 [==========>...................] - ETA: 1:40 - loss: 2.9638 - regression_loss: 2.3902 - classification_loss: 0.5736 197/500 [==========>...................] - ETA: 1:40 - loss: 2.9637 - regression_loss: 2.3901 - classification_loss: 0.5736 198/500 [==========>...................] - ETA: 1:39 - loss: 2.9623 - regression_loss: 2.3895 - classification_loss: 0.5728 199/500 [==========>...................] - ETA: 1:39 - loss: 2.9622 - regression_loss: 2.3892 - classification_loss: 0.5730 200/500 [===========>..................] - ETA: 1:39 - loss: 2.9619 - regression_loss: 2.3891 - classification_loss: 0.5728 201/500 [===========>..................] - ETA: 1:38 - loss: 2.9644 - regression_loss: 2.3912 - classification_loss: 0.5732 202/500 [===========>..................] - ETA: 1:38 - loss: 2.9634 - regression_loss: 2.3907 - classification_loss: 0.5728 203/500 [===========>..................] - ETA: 1:38 - loss: 2.9640 - regression_loss: 2.3908 - classification_loss: 0.5732 204/500 [===========>..................] - ETA: 1:37 - loss: 2.9677 - regression_loss: 2.3929 - classification_loss: 0.5748 205/500 [===========>..................] - ETA: 1:37 - loss: 2.9672 - regression_loss: 2.3924 - classification_loss: 0.5748 206/500 [===========>..................] - ETA: 1:37 - loss: 2.9673 - regression_loss: 2.3925 - classification_loss: 0.5748 207/500 [===========>..................] - ETA: 1:36 - loss: 2.9665 - regression_loss: 2.3923 - classification_loss: 0.5742 208/500 [===========>..................] - ETA: 1:36 - loss: 2.9663 - regression_loss: 2.3922 - classification_loss: 0.5740 209/500 [===========>..................] - ETA: 1:36 - loss: 2.9646 - regression_loss: 2.3913 - classification_loss: 0.5734 210/500 [===========>..................] 
- ETA: 1:35 - loss: 2.9645 - regression_loss: 2.3915 - classification_loss: 0.5730 211/500 [===========>..................] - ETA: 1:35 - loss: 2.9646 - regression_loss: 2.3917 - classification_loss: 0.5729 212/500 [===========>..................] - ETA: 1:35 - loss: 2.9649 - regression_loss: 2.3918 - classification_loss: 0.5731 213/500 [===========>..................] - ETA: 1:34 - loss: 2.9654 - regression_loss: 2.3885 - classification_loss: 0.5768 214/500 [===========>..................] - ETA: 1:34 - loss: 2.9656 - regression_loss: 2.3887 - classification_loss: 0.5769 215/500 [===========>..................] - ETA: 1:34 - loss: 2.9688 - regression_loss: 2.3905 - classification_loss: 0.5783 216/500 [===========>..................] - ETA: 1:33 - loss: 2.9682 - regression_loss: 2.3902 - classification_loss: 0.5781 217/500 [============>.................] - ETA: 1:33 - loss: 2.9672 - regression_loss: 2.3894 - classification_loss: 0.5777 218/500 [============>.................] - ETA: 1:33 - loss: 2.9660 - regression_loss: 2.3891 - classification_loss: 0.5770 219/500 [============>.................] - ETA: 1:32 - loss: 2.9657 - regression_loss: 2.3894 - classification_loss: 0.5763 220/500 [============>.................] - ETA: 1:32 - loss: 2.9659 - regression_loss: 2.3892 - classification_loss: 0.5767 221/500 [============>.................] - ETA: 1:32 - loss: 2.9664 - regression_loss: 2.3894 - classification_loss: 0.5770 222/500 [============>.................] - ETA: 1:31 - loss: 2.9665 - regression_loss: 2.3896 - classification_loss: 0.5768 223/500 [============>.................] - ETA: 1:31 - loss: 2.9667 - regression_loss: 2.3900 - classification_loss: 0.5768 224/500 [============>.................] - ETA: 1:31 - loss: 2.9678 - regression_loss: 2.3913 - classification_loss: 0.5766 225/500 [============>.................] - ETA: 1:30 - loss: 2.9684 - regression_loss: 2.3916 - classification_loss: 0.5767 226/500 [============>.................] 
- ETA: 1:30 - loss: 2.9685 - regression_loss: 2.3918 - classification_loss: 0.5767 227/500 [============>.................] - ETA: 1:30 - loss: 2.9689 - regression_loss: 2.3921 - classification_loss: 0.5768 228/500 [============>.................] - ETA: 1:29 - loss: 2.9708 - regression_loss: 2.3924 - classification_loss: 0.5784 229/500 [============>.................] - ETA: 1:29 - loss: 2.9705 - regression_loss: 2.3922 - classification_loss: 0.5783 230/500 [============>.................] - ETA: 1:29 - loss: 2.9701 - regression_loss: 2.3922 - classification_loss: 0.5780 231/500 [============>.................] - ETA: 1:28 - loss: 2.9675 - regression_loss: 2.3905 - classification_loss: 0.5770 232/500 [============>.................] - ETA: 1:28 - loss: 2.9688 - regression_loss: 2.3913 - classification_loss: 0.5775 233/500 [============>.................] - ETA: 1:28 - loss: 2.9702 - regression_loss: 2.3931 - classification_loss: 0.5771 234/500 [=============>................] - ETA: 1:27 - loss: 2.9708 - regression_loss: 2.3944 - classification_loss: 0.5764 235/500 [=============>................] - ETA: 1:27 - loss: 2.9686 - regression_loss: 2.3927 - classification_loss: 0.5759 236/500 [=============>................] - ETA: 1:27 - loss: 2.9683 - regression_loss: 2.3928 - classification_loss: 0.5756 237/500 [=============>................] - ETA: 1:26 - loss: 2.9700 - regression_loss: 2.3936 - classification_loss: 0.5764 238/500 [=============>................] - ETA: 1:26 - loss: 2.9722 - regression_loss: 2.3948 - classification_loss: 0.5774 239/500 [=============>................] - ETA: 1:26 - loss: 2.9713 - regression_loss: 2.3947 - classification_loss: 0.5767 240/500 [=============>................] - ETA: 1:25 - loss: 2.9686 - regression_loss: 2.3927 - classification_loss: 0.5758 241/500 [=============>................] - ETA: 1:25 - loss: 2.9689 - regression_loss: 2.3928 - classification_loss: 0.5762 242/500 [=============>................] 
- ETA: 1:25 - loss: 2.9696 - regression_loss: 2.3936 - classification_loss: 0.5761 243/500 [=============>................] - ETA: 1:24 - loss: 2.9693 - regression_loss: 2.3935 - classification_loss: 0.5758 244/500 [=============>................] - ETA: 1:24 - loss: 2.9654 - regression_loss: 2.3904 - classification_loss: 0.5750 245/500 [=============>................] - ETA: 1:24 - loss: 2.9622 - regression_loss: 2.3876 - classification_loss: 0.5746 246/500 [=============>................] - ETA: 1:23 - loss: 2.9605 - regression_loss: 2.3860 - classification_loss: 0.5745 247/500 [=============>................] - ETA: 1:23 - loss: 2.9604 - regression_loss: 2.3862 - classification_loss: 0.5742 248/500 [=============>................] - ETA: 1:23 - loss: 2.9598 - regression_loss: 2.3859 - classification_loss: 0.5738 249/500 [=============>................] - ETA: 1:22 - loss: 2.9595 - regression_loss: 2.3858 - classification_loss: 0.5737 250/500 [==============>...............] - ETA: 1:22 - loss: 2.9590 - regression_loss: 2.3855 - classification_loss: 0.5736 251/500 [==============>...............] - ETA: 1:22 - loss: 2.9588 - regression_loss: 2.3853 - classification_loss: 0.5735 252/500 [==============>...............] - ETA: 1:22 - loss: 2.9595 - regression_loss: 2.3861 - classification_loss: 0.5735 253/500 [==============>...............] - ETA: 1:21 - loss: 2.9600 - regression_loss: 2.3859 - classification_loss: 0.5741 254/500 [==============>...............] - ETA: 1:21 - loss: 2.9594 - regression_loss: 2.3857 - classification_loss: 0.5738 255/500 [==============>...............] - ETA: 1:21 - loss: 2.9593 - regression_loss: 2.3859 - classification_loss: 0.5735 256/500 [==============>...............] - ETA: 1:20 - loss: 2.9590 - regression_loss: 2.3858 - classification_loss: 0.5732 257/500 [==============>...............] - ETA: 1:20 - loss: 2.9578 - regression_loss: 2.3849 - classification_loss: 0.5729 258/500 [==============>...............] 
- ETA: 1:20 - loss: 2.9579 - regression_loss: 2.3853 - classification_loss: 0.5726 259/500 [==============>...............] - ETA: 1:19 - loss: 2.9568 - regression_loss: 2.3848 - classification_loss: 0.5720 260/500 [==============>...............] - ETA: 1:19 - loss: 2.9563 - regression_loss: 2.3846 - classification_loss: 0.5716 261/500 [==============>...............] - ETA: 1:19 - loss: 2.9555 - regression_loss: 2.3846 - classification_loss: 0.5709 262/500 [==============>...............] - ETA: 1:18 - loss: 2.9545 - regression_loss: 2.3839 - classification_loss: 0.5706 263/500 [==============>...............] - ETA: 1:18 - loss: 2.9552 - regression_loss: 2.3850 - classification_loss: 0.5702 264/500 [==============>...............] - ETA: 1:18 - loss: 2.9545 - regression_loss: 2.3853 - classification_loss: 0.5693 265/500 [==============>...............] - ETA: 1:17 - loss: 2.9539 - regression_loss: 2.3848 - classification_loss: 0.5691 266/500 [==============>...............] - ETA: 1:17 - loss: 2.9536 - regression_loss: 2.3847 - classification_loss: 0.5689 267/500 [===============>..............] - ETA: 1:17 - loss: 2.9536 - regression_loss: 2.3844 - classification_loss: 0.5692 268/500 [===============>..............] - ETA: 1:16 - loss: 2.9537 - regression_loss: 2.3845 - classification_loss: 0.5693 269/500 [===============>..............] - ETA: 1:16 - loss: 2.9526 - regression_loss: 2.3841 - classification_loss: 0.5685 270/500 [===============>..............] - ETA: 1:16 - loss: 2.9522 - regression_loss: 2.3839 - classification_loss: 0.5683 271/500 [===============>..............] - ETA: 1:15 - loss: 2.9516 - regression_loss: 2.3837 - classification_loss: 0.5680 272/500 [===============>..............] - ETA: 1:15 - loss: 2.9511 - regression_loss: 2.3835 - classification_loss: 0.5676 273/500 [===============>..............] - ETA: 1:15 - loss: 2.9491 - regression_loss: 2.3821 - classification_loss: 0.5669 274/500 [===============>..............] 
- ETA: 1:14 - loss: 2.9485 - regression_loss: 2.3817 - classification_loss: 0.5668 275/500 [===============>..............] - ETA: 1:14 - loss: 2.9483 - regression_loss: 2.3804 - classification_loss: 0.5680 276/500 [===============>..............] - ETA: 1:14 - loss: 2.9476 - regression_loss: 2.3801 - classification_loss: 0.5675 277/500 [===============>..............] - ETA: 1:13 - loss: 2.9476 - regression_loss: 2.3802 - classification_loss: 0.5674 278/500 [===============>..............] - ETA: 1:13 - loss: 2.9478 - regression_loss: 2.3806 - classification_loss: 0.5672 279/500 [===============>..............] - ETA: 1:13 - loss: 2.9477 - regression_loss: 2.3806 - classification_loss: 0.5671 280/500 [===============>..............] - ETA: 1:12 - loss: 2.9466 - regression_loss: 2.3798 - classification_loss: 0.5668 281/500 [===============>..............] - ETA: 1:12 - loss: 2.9458 - regression_loss: 2.3797 - classification_loss: 0.5661 282/500 [===============>..............] - ETA: 1:12 - loss: 2.9455 - regression_loss: 2.3795 - classification_loss: 0.5660 283/500 [===============>..............] - ETA: 1:11 - loss: 2.9456 - regression_loss: 2.3799 - classification_loss: 0.5658 284/500 [================>.............] - ETA: 1:11 - loss: 2.9453 - regression_loss: 2.3797 - classification_loss: 0.5656 285/500 [================>.............] - ETA: 1:11 - loss: 2.9451 - regression_loss: 2.3796 - classification_loss: 0.5656 286/500 [================>.............] - ETA: 1:10 - loss: 2.9453 - regression_loss: 2.3800 - classification_loss: 0.5653 287/500 [================>.............] - ETA: 1:10 - loss: 2.9420 - regression_loss: 2.3776 - classification_loss: 0.5644 288/500 [================>.............] - ETA: 1:10 - loss: 2.9418 - regression_loss: 2.3775 - classification_loss: 0.5643 289/500 [================>.............] - ETA: 1:09 - loss: 2.9417 - regression_loss: 2.3776 - classification_loss: 0.5642 290/500 [================>.............] 
- ETA: 1:09 - loss: 2.9415 - regression_loss: 2.3773 - classification_loss: 0.5641
[per-step progress output trimmed: steps 291-497/500 of epoch 3; loss 2.9412 -> 2.8928, regression_loss 2.3771 -> 2.3520, classification_loss 0.5640 -> 0.5408]
498/500 [============================>.]
- ETA: 0s - loss: 2.8919 - regression_loss: 2.3512 - classification_loss: 0.5407
500/500 [==============================] - 165s 330ms/step - loss: 2.8918 - regression_loss: 2.3513 - classification_loss: 0.5405

1172 instances of class plum with average precision: 0.0638
mAP: 0.0638

Epoch 00003: saving model to ./training/snapshots/resnet101_pascal_03.h5
Epoch 4/150

[per-step progress output trimmed: steps 1-124/500 of epoch 4; loss 2.9439 -> 2.7886, regression_loss 2.3885 -> 2.2918, classification_loss 0.5554 -> 0.4967]
125/500 [======>.......................]
- ETA: 2:07 - loss: 2.7897 - regression_loss: 2.2919 - classification_loss: 0.4978 110/500 [=====>........................] - ETA: 2:07 - loss: 2.7912 - regression_loss: 2.2930 - classification_loss: 0.4982 111/500 [=====>........................] - ETA: 2:07 - loss: 2.7904 - regression_loss: 2.2925 - classification_loss: 0.4979 112/500 [=====>........................] - ETA: 2:06 - loss: 2.7899 - regression_loss: 2.2923 - classification_loss: 0.4976 113/500 [=====>........................] - ETA: 2:06 - loss: 2.7901 - regression_loss: 2.2919 - classification_loss: 0.4981 114/500 [=====>........................] - ETA: 2:06 - loss: 2.7846 - regression_loss: 2.2873 - classification_loss: 0.4973 115/500 [=====>........................] - ETA: 2:05 - loss: 2.7848 - regression_loss: 2.2872 - classification_loss: 0.4977 116/500 [=====>........................] - ETA: 2:05 - loss: 2.7832 - regression_loss: 2.2860 - classification_loss: 0.4972 117/500 [======>.......................] - ETA: 2:05 - loss: 2.7845 - regression_loss: 2.2874 - classification_loss: 0.4972 118/500 [======>.......................] - ETA: 2:04 - loss: 2.7841 - regression_loss: 2.2873 - classification_loss: 0.4967 119/500 [======>.......................] - ETA: 2:04 - loss: 2.7831 - regression_loss: 2.2876 - classification_loss: 0.4956 120/500 [======>.......................] - ETA: 2:04 - loss: 2.7837 - regression_loss: 2.2885 - classification_loss: 0.4952 121/500 [======>.......................] - ETA: 2:03 - loss: 2.7892 - regression_loss: 2.2922 - classification_loss: 0.4971 122/500 [======>.......................] - ETA: 2:03 - loss: 2.7895 - regression_loss: 2.2929 - classification_loss: 0.4966 123/500 [======>.......................] - ETA: 2:03 - loss: 2.7897 - regression_loss: 2.2931 - classification_loss: 0.4966 124/500 [======>.......................] - ETA: 2:02 - loss: 2.7886 - regression_loss: 2.2918 - classification_loss: 0.4967 125/500 [======>.......................] 
- ETA: 2:02 - loss: 2.7898 - regression_loss: 2.2935 - classification_loss: 0.4963 126/500 [======>.......................] - ETA: 2:02 - loss: 2.7892 - regression_loss: 2.2932 - classification_loss: 0.4960 127/500 [======>.......................] - ETA: 2:01 - loss: 2.7887 - regression_loss: 2.2931 - classification_loss: 0.4956 128/500 [======>.......................] - ETA: 2:01 - loss: 2.7891 - regression_loss: 2.2938 - classification_loss: 0.4953 129/500 [======>.......................] - ETA: 2:01 - loss: 2.7892 - regression_loss: 2.2942 - classification_loss: 0.4950 130/500 [======>.......................] - ETA: 2:00 - loss: 2.7911 - regression_loss: 2.2949 - classification_loss: 0.4962 131/500 [======>.......................] - ETA: 2:00 - loss: 2.7900 - regression_loss: 2.2945 - classification_loss: 0.4955 132/500 [======>.......................] - ETA: 2:00 - loss: 2.7899 - regression_loss: 2.2946 - classification_loss: 0.4952 133/500 [======>.......................] - ETA: 1:59 - loss: 2.7895 - regression_loss: 2.2944 - classification_loss: 0.4950 134/500 [=======>......................] - ETA: 1:59 - loss: 2.7892 - regression_loss: 2.2945 - classification_loss: 0.4947 135/500 [=======>......................] - ETA: 1:59 - loss: 2.7899 - regression_loss: 2.2956 - classification_loss: 0.4943 136/500 [=======>......................] - ETA: 1:58 - loss: 2.7898 - regression_loss: 2.2956 - classification_loss: 0.4943 137/500 [=======>......................] - ETA: 1:58 - loss: 2.7890 - regression_loss: 2.2950 - classification_loss: 0.4940 138/500 [=======>......................] - ETA: 1:58 - loss: 2.7890 - regression_loss: 2.2949 - classification_loss: 0.4941 139/500 [=======>......................] - ETA: 1:58 - loss: 2.7901 - regression_loss: 2.2953 - classification_loss: 0.4947 140/500 [=======>......................] - ETA: 1:57 - loss: 2.7910 - regression_loss: 2.2961 - classification_loss: 0.4949 141/500 [=======>......................] 
- ETA: 1:57 - loss: 2.7890 - regression_loss: 2.2946 - classification_loss: 0.4944 142/500 [=======>......................] - ETA: 1:56 - loss: 2.7883 - regression_loss: 2.2943 - classification_loss: 0.4940 143/500 [=======>......................] - ETA: 1:56 - loss: 2.7888 - regression_loss: 2.2946 - classification_loss: 0.4942 144/500 [=======>......................] - ETA: 1:56 - loss: 2.7827 - regression_loss: 2.2898 - classification_loss: 0.4930 145/500 [=======>......................] - ETA: 1:56 - loss: 2.7828 - regression_loss: 2.2899 - classification_loss: 0.4929 146/500 [=======>......................] - ETA: 1:55 - loss: 2.7823 - regression_loss: 2.2899 - classification_loss: 0.4924 147/500 [=======>......................] - ETA: 1:55 - loss: 2.7810 - regression_loss: 2.2895 - classification_loss: 0.4916 148/500 [=======>......................] - ETA: 1:55 - loss: 2.7834 - regression_loss: 2.2914 - classification_loss: 0.4920 149/500 [=======>......................] - ETA: 1:54 - loss: 2.7789 - regression_loss: 2.2874 - classification_loss: 0.4915 150/500 [========>.....................] - ETA: 1:54 - loss: 2.7757 - regression_loss: 2.2850 - classification_loss: 0.4907 151/500 [========>.....................] - ETA: 1:54 - loss: 2.7746 - regression_loss: 2.2843 - classification_loss: 0.4903 152/500 [========>.....................] - ETA: 1:53 - loss: 2.7738 - regression_loss: 2.2846 - classification_loss: 0.4892 153/500 [========>.....................] - ETA: 1:53 - loss: 2.7744 - regression_loss: 2.2848 - classification_loss: 0.4895 154/500 [========>.....................] - ETA: 1:53 - loss: 2.7734 - regression_loss: 2.2843 - classification_loss: 0.4891 155/500 [========>.....................] - ETA: 1:52 - loss: 2.7735 - regression_loss: 2.2841 - classification_loss: 0.4893 156/500 [========>.....................] - ETA: 1:52 - loss: 2.7713 - regression_loss: 2.2827 - classification_loss: 0.4886 157/500 [========>.....................] 
- ETA: 1:52 - loss: 2.7722 - regression_loss: 2.2834 - classification_loss: 0.4888 158/500 [========>.....................] - ETA: 1:51 - loss: 2.7721 - regression_loss: 2.2835 - classification_loss: 0.4886 159/500 [========>.....................] - ETA: 1:51 - loss: 2.7720 - regression_loss: 2.2841 - classification_loss: 0.4879 160/500 [========>.....................] - ETA: 1:51 - loss: 2.7716 - regression_loss: 2.2840 - classification_loss: 0.4876 161/500 [========>.....................] - ETA: 1:50 - loss: 2.7706 - regression_loss: 2.2830 - classification_loss: 0.4875 162/500 [========>.....................] - ETA: 1:50 - loss: 2.7712 - regression_loss: 2.2831 - classification_loss: 0.4881 163/500 [========>.....................] - ETA: 1:50 - loss: 2.7703 - regression_loss: 2.2825 - classification_loss: 0.4878 164/500 [========>.....................] - ETA: 1:49 - loss: 2.7699 - regression_loss: 2.2823 - classification_loss: 0.4876 165/500 [========>.....................] - ETA: 1:49 - loss: 2.7690 - regression_loss: 2.2819 - classification_loss: 0.4871 166/500 [========>.....................] - ETA: 1:49 - loss: 2.7669 - regression_loss: 2.2795 - classification_loss: 0.4874 167/500 [=========>....................] - ETA: 1:48 - loss: 2.7675 - regression_loss: 2.2799 - classification_loss: 0.4876 168/500 [=========>....................] - ETA: 1:48 - loss: 2.7659 - regression_loss: 2.2790 - classification_loss: 0.4869 169/500 [=========>....................] - ETA: 1:48 - loss: 2.7666 - regression_loss: 2.2799 - classification_loss: 0.4867 170/500 [=========>....................] - ETA: 1:48 - loss: 2.7662 - regression_loss: 2.2801 - classification_loss: 0.4861 171/500 [=========>....................] - ETA: 1:47 - loss: 2.7675 - regression_loss: 2.2809 - classification_loss: 0.4866 172/500 [=========>....................] - ETA: 1:47 - loss: 2.7690 - regression_loss: 2.2823 - classification_loss: 0.4867 173/500 [=========>....................] 
- ETA: 1:47 - loss: 2.7669 - regression_loss: 2.2809 - classification_loss: 0.4859 174/500 [=========>....................] - ETA: 1:46 - loss: 2.7671 - regression_loss: 2.2811 - classification_loss: 0.4860 175/500 [=========>....................] - ETA: 1:46 - loss: 2.7661 - regression_loss: 2.2804 - classification_loss: 0.4857 176/500 [=========>....................] - ETA: 1:46 - loss: 2.7694 - regression_loss: 2.2823 - classification_loss: 0.4871 177/500 [=========>....................] - ETA: 1:45 - loss: 2.7658 - regression_loss: 2.2794 - classification_loss: 0.4864 178/500 [=========>....................] - ETA: 1:45 - loss: 2.7621 - regression_loss: 2.2748 - classification_loss: 0.4873 179/500 [=========>....................] - ETA: 1:45 - loss: 2.7604 - regression_loss: 2.2734 - classification_loss: 0.4871 180/500 [=========>....................] - ETA: 1:44 - loss: 2.7573 - regression_loss: 2.2707 - classification_loss: 0.4866 181/500 [=========>....................] - ETA: 1:44 - loss: 2.7597 - regression_loss: 2.2719 - classification_loss: 0.4878 182/500 [=========>....................] - ETA: 1:44 - loss: 2.7605 - regression_loss: 2.2725 - classification_loss: 0.4880 183/500 [=========>....................] - ETA: 1:43 - loss: 2.7603 - regression_loss: 2.2724 - classification_loss: 0.4878 184/500 [==========>...................] - ETA: 1:43 - loss: 2.7603 - regression_loss: 2.2727 - classification_loss: 0.4876 185/500 [==========>...................] - ETA: 1:43 - loss: 2.7603 - regression_loss: 2.2727 - classification_loss: 0.4876 186/500 [==========>...................] - ETA: 1:42 - loss: 2.7599 - regression_loss: 2.2721 - classification_loss: 0.4878 187/500 [==========>...................] - ETA: 1:42 - loss: 2.7590 - regression_loss: 2.2716 - classification_loss: 0.4874 188/500 [==========>...................] - ETA: 1:42 - loss: 2.7607 - regression_loss: 2.2730 - classification_loss: 0.4877 189/500 [==========>...................] 
- ETA: 1:41 - loss: 2.7612 - regression_loss: 2.2733 - classification_loss: 0.4879 190/500 [==========>...................] - ETA: 1:41 - loss: 2.7612 - regression_loss: 2.2728 - classification_loss: 0.4884 191/500 [==========>...................] - ETA: 1:41 - loss: 2.7639 - regression_loss: 2.2750 - classification_loss: 0.4889 192/500 [==========>...................] - ETA: 1:40 - loss: 2.7643 - regression_loss: 2.2753 - classification_loss: 0.4890 193/500 [==========>...................] - ETA: 1:40 - loss: 2.7639 - regression_loss: 2.2752 - classification_loss: 0.4888 194/500 [==========>...................] - ETA: 1:40 - loss: 2.7628 - regression_loss: 2.2740 - classification_loss: 0.4888 195/500 [==========>...................] - ETA: 1:39 - loss: 2.7629 - regression_loss: 2.2738 - classification_loss: 0.4892 196/500 [==========>...................] - ETA: 1:39 - loss: 2.7624 - regression_loss: 2.2735 - classification_loss: 0.4889 197/500 [==========>...................] - ETA: 1:39 - loss: 2.7622 - regression_loss: 2.2735 - classification_loss: 0.4887 198/500 [==========>...................] - ETA: 1:38 - loss: 2.7621 - regression_loss: 2.2735 - classification_loss: 0.4886 199/500 [==========>...................] - ETA: 1:38 - loss: 2.7620 - regression_loss: 2.2731 - classification_loss: 0.4889 200/500 [===========>..................] - ETA: 1:38 - loss: 2.7581 - regression_loss: 2.2699 - classification_loss: 0.4882 201/500 [===========>..................] - ETA: 1:37 - loss: 2.7581 - regression_loss: 2.2700 - classification_loss: 0.4881 202/500 [===========>..................] - ETA: 1:37 - loss: 2.7576 - regression_loss: 2.2698 - classification_loss: 0.4878 203/500 [===========>..................] - ETA: 1:37 - loss: 2.7519 - regression_loss: 2.2649 - classification_loss: 0.4870 204/500 [===========>..................] - ETA: 1:37 - loss: 2.7516 - regression_loss: 2.2647 - classification_loss: 0.4869 205/500 [===========>..................] 
- ETA: 1:36 - loss: 2.7512 - regression_loss: 2.2640 - classification_loss: 0.4872 206/500 [===========>..................] - ETA: 1:36 - loss: 2.7505 - regression_loss: 2.2627 - classification_loss: 0.4878 207/500 [===========>..................] - ETA: 1:36 - loss: 2.7514 - regression_loss: 2.2633 - classification_loss: 0.4881 208/500 [===========>..................] - ETA: 1:35 - loss: 2.7523 - regression_loss: 2.2642 - classification_loss: 0.4882 209/500 [===========>..................] - ETA: 1:35 - loss: 2.7509 - regression_loss: 2.2632 - classification_loss: 0.4876 210/500 [===========>..................] - ETA: 1:35 - loss: 2.7473 - regression_loss: 2.2604 - classification_loss: 0.4869 211/500 [===========>..................] - ETA: 1:34 - loss: 2.7455 - regression_loss: 2.2585 - classification_loss: 0.4871 212/500 [===========>..................] - ETA: 1:34 - loss: 2.7457 - regression_loss: 2.2587 - classification_loss: 0.4870 213/500 [===========>..................] - ETA: 1:34 - loss: 2.7476 - regression_loss: 2.2607 - classification_loss: 0.4870 214/500 [===========>..................] - ETA: 1:33 - loss: 2.7476 - regression_loss: 2.2608 - classification_loss: 0.4868 215/500 [===========>..................] - ETA: 1:33 - loss: 2.7489 - regression_loss: 2.2617 - classification_loss: 0.4872 216/500 [===========>..................] - ETA: 1:33 - loss: 2.7479 - regression_loss: 2.2608 - classification_loss: 0.4871 217/500 [============>.................] - ETA: 1:32 - loss: 2.7489 - regression_loss: 2.2618 - classification_loss: 0.4871 218/500 [============>.................] - ETA: 1:32 - loss: 2.7481 - regression_loss: 2.2614 - classification_loss: 0.4866 219/500 [============>.................] - ETA: 1:32 - loss: 2.7454 - regression_loss: 2.2592 - classification_loss: 0.4862 220/500 [============>.................] - ETA: 1:31 - loss: 2.7439 - regression_loss: 2.2579 - classification_loss: 0.4860 221/500 [============>.................] 
- ETA: 1:31 - loss: 2.7436 - regression_loss: 2.2579 - classification_loss: 0.4856 222/500 [============>.................] - ETA: 1:31 - loss: 2.7414 - regression_loss: 2.2565 - classification_loss: 0.4849 223/500 [============>.................] - ETA: 1:30 - loss: 2.7424 - regression_loss: 2.2567 - classification_loss: 0.4857 224/500 [============>.................] - ETA: 1:30 - loss: 2.7423 - regression_loss: 2.2564 - classification_loss: 0.4859 225/500 [============>.................] - ETA: 1:30 - loss: 2.7418 - regression_loss: 2.2559 - classification_loss: 0.4859 226/500 [============>.................] - ETA: 1:30 - loss: 2.7391 - regression_loss: 2.2540 - classification_loss: 0.4851 227/500 [============>.................] - ETA: 1:29 - loss: 2.7397 - regression_loss: 2.2545 - classification_loss: 0.4852 228/500 [============>.................] - ETA: 1:29 - loss: 2.7404 - regression_loss: 2.2551 - classification_loss: 0.4854 229/500 [============>.................] - ETA: 1:29 - loss: 2.7392 - regression_loss: 2.2541 - classification_loss: 0.4851 230/500 [============>.................] - ETA: 1:28 - loss: 2.7396 - regression_loss: 2.2545 - classification_loss: 0.4851 231/500 [============>.................] - ETA: 1:28 - loss: 2.7388 - regression_loss: 2.2541 - classification_loss: 0.4847 232/500 [============>.................] - ETA: 1:28 - loss: 2.7400 - regression_loss: 2.2548 - classification_loss: 0.4852 233/500 [============>.................] - ETA: 1:27 - loss: 2.7402 - regression_loss: 2.2551 - classification_loss: 0.4851 234/500 [=============>................] - ETA: 1:27 - loss: 2.7390 - regression_loss: 2.2542 - classification_loss: 0.4849 235/500 [=============>................] - ETA: 1:27 - loss: 2.7391 - regression_loss: 2.2542 - classification_loss: 0.4849 236/500 [=============>................] - ETA: 1:26 - loss: 2.7369 - regression_loss: 2.2523 - classification_loss: 0.4846 237/500 [=============>................] 
- ETA: 1:26 - loss: 2.7375 - regression_loss: 2.2526 - classification_loss: 0.4848 238/500 [=============>................] - ETA: 1:26 - loss: 2.7362 - regression_loss: 2.2518 - classification_loss: 0.4844 239/500 [=============>................] - ETA: 1:25 - loss: 2.7368 - regression_loss: 2.2524 - classification_loss: 0.4844 240/500 [=============>................] - ETA: 1:25 - loss: 2.7366 - regression_loss: 2.2523 - classification_loss: 0.4843 241/500 [=============>................] - ETA: 1:25 - loss: 2.7362 - regression_loss: 2.2521 - classification_loss: 0.4842 242/500 [=============>................] - ETA: 1:24 - loss: 2.7359 - regression_loss: 2.2519 - classification_loss: 0.4840 243/500 [=============>................] - ETA: 1:24 - loss: 2.7348 - regression_loss: 2.2511 - classification_loss: 0.4836 244/500 [=============>................] - ETA: 1:24 - loss: 2.7351 - regression_loss: 2.2514 - classification_loss: 0.4837 245/500 [=============>................] - ETA: 1:23 - loss: 2.7352 - regression_loss: 2.2515 - classification_loss: 0.4837 246/500 [=============>................] - ETA: 1:23 - loss: 2.7357 - regression_loss: 2.2516 - classification_loss: 0.4841 247/500 [=============>................] - ETA: 1:23 - loss: 2.7357 - regression_loss: 2.2518 - classification_loss: 0.4838 248/500 [=============>................] - ETA: 1:22 - loss: 2.7377 - regression_loss: 2.2530 - classification_loss: 0.4847 249/500 [=============>................] - ETA: 1:22 - loss: 2.7376 - regression_loss: 2.2528 - classification_loss: 0.4848 250/500 [==============>...............] - ETA: 1:22 - loss: 2.7363 - regression_loss: 2.2518 - classification_loss: 0.4845 251/500 [==============>...............] - ETA: 1:21 - loss: 2.7360 - regression_loss: 2.2515 - classification_loss: 0.4845 252/500 [==============>...............] - ETA: 1:21 - loss: 2.7326 - regression_loss: 2.2488 - classification_loss: 0.4838 253/500 [==============>...............] 
- ETA: 1:21 - loss: 2.7338 - regression_loss: 2.2501 - classification_loss: 0.4836 254/500 [==============>...............] - ETA: 1:20 - loss: 2.7328 - regression_loss: 2.2494 - classification_loss: 0.4834 255/500 [==============>...............] - ETA: 1:20 - loss: 2.7326 - regression_loss: 2.2498 - classification_loss: 0.4828 256/500 [==============>...............] - ETA: 1:20 - loss: 2.7328 - regression_loss: 2.2502 - classification_loss: 0.4826 257/500 [==============>...............] - ETA: 1:19 - loss: 2.7320 - regression_loss: 2.2500 - classification_loss: 0.4820 258/500 [==============>...............] - ETA: 1:19 - loss: 2.7322 - regression_loss: 2.2502 - classification_loss: 0.4820 259/500 [==============>...............] - ETA: 1:19 - loss: 2.7291 - regression_loss: 2.2477 - classification_loss: 0.4814 260/500 [==============>...............] - ETA: 1:18 - loss: 2.7293 - regression_loss: 2.2478 - classification_loss: 0.4815 261/500 [==============>...............] - ETA: 1:18 - loss: 2.7304 - regression_loss: 2.2486 - classification_loss: 0.4817 262/500 [==============>...............] - ETA: 1:18 - loss: 2.7293 - regression_loss: 2.2471 - classification_loss: 0.4822 263/500 [==============>...............] - ETA: 1:17 - loss: 2.7280 - regression_loss: 2.2460 - classification_loss: 0.4820 264/500 [==============>...............] - ETA: 1:17 - loss: 2.7268 - regression_loss: 2.2455 - classification_loss: 0.4813 265/500 [==============>...............] - ETA: 1:17 - loss: 2.7266 - regression_loss: 2.2454 - classification_loss: 0.4812 266/500 [==============>...............] - ETA: 1:16 - loss: 2.7267 - regression_loss: 2.2457 - classification_loss: 0.4810 267/500 [===============>..............] - ETA: 1:16 - loss: 2.7233 - regression_loss: 2.2429 - classification_loss: 0.4804 268/500 [===============>..............] - ETA: 1:16 - loss: 2.7235 - regression_loss: 2.2431 - classification_loss: 0.4804 269/500 [===============>..............] 
- ETA: 1:15 - loss: 2.7223 - regression_loss: 2.2423 - classification_loss: 0.4800 270/500 [===============>..............] - ETA: 1:15 - loss: 2.7218 - regression_loss: 2.2418 - classification_loss: 0.4799 271/500 [===============>..............] - ETA: 1:15 - loss: 2.7211 - regression_loss: 2.2415 - classification_loss: 0.4796 272/500 [===============>..............] - ETA: 1:15 - loss: 2.7209 - regression_loss: 2.2413 - classification_loss: 0.4796 273/500 [===============>..............] - ETA: 1:14 - loss: 2.7206 - regression_loss: 2.2413 - classification_loss: 0.4793 274/500 [===============>..............] - ETA: 1:14 - loss: 2.7207 - regression_loss: 2.2414 - classification_loss: 0.4793 275/500 [===============>..............] - ETA: 1:13 - loss: 2.7210 - regression_loss: 2.2415 - classification_loss: 0.4795 276/500 [===============>..............] - ETA: 1:13 - loss: 2.7209 - regression_loss: 2.2415 - classification_loss: 0.4794 277/500 [===============>..............] - ETA: 1:13 - loss: 2.7210 - regression_loss: 2.2415 - classification_loss: 0.4795 278/500 [===============>..............] - ETA: 1:13 - loss: 2.7195 - regression_loss: 2.2405 - classification_loss: 0.4790 279/500 [===============>..............] - ETA: 1:12 - loss: 2.7193 - regression_loss: 2.2403 - classification_loss: 0.4790 280/500 [===============>..............] - ETA: 1:12 - loss: 2.7210 - regression_loss: 2.2412 - classification_loss: 0.4797 281/500 [===============>..............] - ETA: 1:12 - loss: 2.7209 - regression_loss: 2.2412 - classification_loss: 0.4797 282/500 [===============>..............] - ETA: 1:11 - loss: 2.7212 - regression_loss: 2.2414 - classification_loss: 0.4798 283/500 [===============>..............] - ETA: 1:11 - loss: 2.7216 - regression_loss: 2.2418 - classification_loss: 0.4798 284/500 [================>.............] - ETA: 1:11 - loss: 2.7194 - regression_loss: 2.2402 - classification_loss: 0.4792 285/500 [================>.............] 
- ETA: 1:10 - loss: 2.7184 - regression_loss: 2.2397 - classification_loss: 0.4786 286/500 [================>.............] - ETA: 1:10 - loss: 2.7157 - regression_loss: 2.2373 - classification_loss: 0.4783 287/500 [================>.............] - ETA: 1:10 - loss: 2.7159 - regression_loss: 2.2374 - classification_loss: 0.4785 288/500 [================>.............] - ETA: 1:09 - loss: 2.7156 - regression_loss: 2.2373 - classification_loss: 0.4783 289/500 [================>.............] - ETA: 1:09 - loss: 2.7152 - regression_loss: 2.2370 - classification_loss: 0.4781 290/500 [================>.............] - ETA: 1:09 - loss: 2.7143 - regression_loss: 2.2366 - classification_loss: 0.4777 291/500 [================>.............] - ETA: 1:08 - loss: 2.7134 - regression_loss: 2.2362 - classification_loss: 0.4772 292/500 [================>.............] - ETA: 1:08 - loss: 2.7137 - regression_loss: 2.2366 - classification_loss: 0.4771 293/500 [================>.............] - ETA: 1:08 - loss: 2.7132 - regression_loss: 2.2361 - classification_loss: 0.4772 294/500 [================>.............] - ETA: 1:07 - loss: 2.7113 - regression_loss: 2.2344 - classification_loss: 0.4769 295/500 [================>.............] - ETA: 1:07 - loss: 2.7110 - regression_loss: 2.2341 - classification_loss: 0.4768 296/500 [================>.............] - ETA: 1:07 - loss: 2.7105 - regression_loss: 2.2340 - classification_loss: 0.4765 297/500 [================>.............] - ETA: 1:06 - loss: 2.7094 - regression_loss: 2.2331 - classification_loss: 0.4762 298/500 [================>.............] - ETA: 1:06 - loss: 2.7098 - regression_loss: 2.2335 - classification_loss: 0.4763 299/500 [================>.............] - ETA: 1:06 - loss: 2.7083 - regression_loss: 2.2328 - classification_loss: 0.4756 300/500 [=================>............] - ETA: 1:05 - loss: 2.7086 - regression_loss: 2.2330 - classification_loss: 0.4756 301/500 [=================>............] 
- ETA: 1:05 - loss: 2.7080 - regression_loss: 2.2327 - classification_loss: 0.4754 302/500 [=================>............] - ETA: 1:05 - loss: 2.7068 - regression_loss: 2.2319 - classification_loss: 0.4749 303/500 [=================>............] - ETA: 1:04 - loss: 2.7065 - regression_loss: 2.2317 - classification_loss: 0.4747 304/500 [=================>............] - ETA: 1:04 - loss: 2.7072 - regression_loss: 2.2323 - classification_loss: 0.4749 305/500 [=================>............] - ETA: 1:04 - loss: 2.7072 - regression_loss: 2.2322 - classification_loss: 0.4750 306/500 [=================>............] - ETA: 1:03 - loss: 2.7042 - regression_loss: 2.2295 - classification_loss: 0.4747 307/500 [=================>............] - ETA: 1:03 - loss: 2.7045 - regression_loss: 2.2299 - classification_loss: 0.4746 308/500 [=================>............] - ETA: 1:03 - loss: 2.7045 - regression_loss: 2.2305 - classification_loss: 0.4740 309/500 [=================>............] - ETA: 1:02 - loss: 2.7050 - regression_loss: 2.2307 - classification_loss: 0.4744 310/500 [=================>............] - ETA: 1:02 - loss: 2.7051 - regression_loss: 2.2308 - classification_loss: 0.4744 311/500 [=================>............] - ETA: 1:02 - loss: 2.7049 - regression_loss: 2.2306 - classification_loss: 0.4743 312/500 [=================>............] - ETA: 1:01 - loss: 2.7051 - regression_loss: 2.2307 - classification_loss: 0.4744 313/500 [=================>............] - ETA: 1:01 - loss: 2.7054 - regression_loss: 2.2311 - classification_loss: 0.4743 314/500 [=================>............] - ETA: 1:01 - loss: 2.7010 - regression_loss: 2.2272 - classification_loss: 0.4738 315/500 [=================>............] - ETA: 1:00 - loss: 2.7008 - regression_loss: 2.2271 - classification_loss: 0.4737 316/500 [=================>............] - ETA: 1:00 - loss: 2.7022 - regression_loss: 2.2281 - classification_loss: 0.4741 317/500 [==================>...........] 
- ETA: 1:00 - loss: 2.7032 - regression_loss: 2.2288 - classification_loss: 0.4744 318/500 [==================>...........] - ETA: 59s - loss: 2.7030 - regression_loss: 2.2289 - classification_loss: 0.4741  319/500 [==================>...........] - ETA: 59s - loss: 2.7048 - regression_loss: 2.2302 - classification_loss: 0.4746 320/500 [==================>...........] - ETA: 59s - loss: 2.7046 - regression_loss: 2.2301 - classification_loss: 0.4745 321/500 [==================>...........] - ETA: 58s - loss: 2.7046 - regression_loss: 2.2301 - classification_loss: 0.4745 322/500 [==================>...........] - ETA: 58s - loss: 2.7041 - regression_loss: 2.2295 - classification_loss: 0.4746 323/500 [==================>...........] - ETA: 58s - loss: 2.7040 - regression_loss: 2.2294 - classification_loss: 0.4746 324/500 [==================>...........] - ETA: 57s - loss: 2.7049 - regression_loss: 2.2303 - classification_loss: 0.4746 325/500 [==================>...........] - ETA: 57s - loss: 2.7045 - regression_loss: 2.2303 - classification_loss: 0.4742 326/500 [==================>...........] - ETA: 57s - loss: 2.7041 - regression_loss: 2.2304 - classification_loss: 0.4737 327/500 [==================>...........] - ETA: 56s - loss: 2.7044 - regression_loss: 2.2306 - classification_loss: 0.4738 328/500 [==================>...........] - ETA: 56s - loss: 2.7049 - regression_loss: 2.2308 - classification_loss: 0.4741 329/500 [==================>...........] - ETA: 56s - loss: 2.7055 - regression_loss: 2.2311 - classification_loss: 0.4743 330/500 [==================>...........] - ETA: 55s - loss: 2.7054 - regression_loss: 2.2311 - classification_loss: 0.4743 331/500 [==================>...........] - ETA: 55s - loss: 2.7041 - regression_loss: 2.2300 - classification_loss: 0.4741 332/500 [==================>...........] - ETA: 55s - loss: 2.7046 - regression_loss: 2.2304 - classification_loss: 0.4741 333/500 [==================>...........] 
[epoch 4 per-batch progress output (batches 334-493) elided; running loss decreased from 2.70 to 2.67]
500/500 [==============================] - 165s 329ms/step - loss: 2.6682 - regression_loss: 2.2053 - classification_loss: 0.4629
1172 instances of class plum with average precision: 0.1420
mAP: 0.1420
Epoch 00004: saving model to ./training/snapshots/resnet101_pascal_04.h5
Epoch 5/150
[epoch 5 per-batch progress output (early batches) elided; loss settling near 2.50]
- ETA: 1:49 - loss: 2.4910 - regression_loss: 2.0700 - classification_loss: 0.4210 169/500 [=========>....................] - ETA: 1:49 - loss: 2.4924 - regression_loss: 2.0713 - classification_loss: 0.4212 170/500 [=========>....................] - ETA: 1:49 - loss: 2.4869 - regression_loss: 2.0666 - classification_loss: 0.4203 171/500 [=========>....................] - ETA: 1:48 - loss: 2.4886 - regression_loss: 2.0675 - classification_loss: 0.4211 172/500 [=========>....................] - ETA: 1:48 - loss: 2.4896 - regression_loss: 2.0683 - classification_loss: 0.4212 173/500 [=========>....................] - ETA: 1:48 - loss: 2.4886 - regression_loss: 2.0673 - classification_loss: 0.4213 174/500 [=========>....................] - ETA: 1:47 - loss: 2.4901 - regression_loss: 2.0681 - classification_loss: 0.4220 175/500 [=========>....................] - ETA: 1:47 - loss: 2.4872 - regression_loss: 2.0658 - classification_loss: 0.4213 176/500 [=========>....................] - ETA: 1:47 - loss: 2.4870 - regression_loss: 2.0656 - classification_loss: 0.4215 177/500 [=========>....................] - ETA: 1:46 - loss: 2.4878 - regression_loss: 2.0658 - classification_loss: 0.4219 178/500 [=========>....................] - ETA: 1:46 - loss: 2.4911 - regression_loss: 2.0685 - classification_loss: 0.4226 179/500 [=========>....................] - ETA: 1:46 - loss: 2.4942 - regression_loss: 2.0707 - classification_loss: 0.4235 180/500 [=========>....................] - ETA: 1:45 - loss: 2.4946 - regression_loss: 2.0711 - classification_loss: 0.4235 181/500 [=========>....................] - ETA: 1:45 - loss: 2.4963 - regression_loss: 2.0723 - classification_loss: 0.4241 182/500 [=========>....................] - ETA: 1:45 - loss: 2.4959 - regression_loss: 2.0721 - classification_loss: 0.4238 183/500 [=========>....................] - ETA: 1:44 - loss: 2.4973 - regression_loss: 2.0733 - classification_loss: 0.4240 184/500 [==========>...................] 
- ETA: 1:44 - loss: 2.4988 - regression_loss: 2.0741 - classification_loss: 0.4247 185/500 [==========>...................] - ETA: 1:44 - loss: 2.5000 - regression_loss: 2.0752 - classification_loss: 0.4248 186/500 [==========>...................] - ETA: 1:43 - loss: 2.5003 - regression_loss: 2.0753 - classification_loss: 0.4251 187/500 [==========>...................] - ETA: 1:43 - loss: 2.4987 - regression_loss: 2.0741 - classification_loss: 0.4245 188/500 [==========>...................] - ETA: 1:43 - loss: 2.5012 - regression_loss: 2.0757 - classification_loss: 0.4255 189/500 [==========>...................] - ETA: 1:42 - loss: 2.5016 - regression_loss: 2.0759 - classification_loss: 0.4257 190/500 [==========>...................] - ETA: 1:42 - loss: 2.5018 - regression_loss: 2.0761 - classification_loss: 0.4257 191/500 [==========>...................] - ETA: 1:42 - loss: 2.5025 - regression_loss: 2.0766 - classification_loss: 0.4260 192/500 [==========>...................] - ETA: 1:41 - loss: 2.5010 - regression_loss: 2.0753 - classification_loss: 0.4257 193/500 [==========>...................] - ETA: 1:41 - loss: 2.5010 - regression_loss: 2.0756 - classification_loss: 0.4254 194/500 [==========>...................] - ETA: 1:41 - loss: 2.4997 - regression_loss: 2.0747 - classification_loss: 0.4250 195/500 [==========>...................] - ETA: 1:40 - loss: 2.5019 - regression_loss: 2.0765 - classification_loss: 0.4254 196/500 [==========>...................] - ETA: 1:40 - loss: 2.5020 - regression_loss: 2.0771 - classification_loss: 0.4248 197/500 [==========>...................] - ETA: 1:40 - loss: 2.5008 - regression_loss: 2.0763 - classification_loss: 0.4245 198/500 [==========>...................] - ETA: 1:39 - loss: 2.5019 - regression_loss: 2.0771 - classification_loss: 0.4249 199/500 [==========>...................] - ETA: 1:39 - loss: 2.5041 - regression_loss: 2.0782 - classification_loss: 0.4258 200/500 [===========>..................] 
- ETA: 1:39 - loss: 2.5036 - regression_loss: 2.0783 - classification_loss: 0.4253 201/500 [===========>..................] - ETA: 1:38 - loss: 2.5045 - regression_loss: 2.0788 - classification_loss: 0.4257 202/500 [===========>..................] - ETA: 1:38 - loss: 2.5022 - regression_loss: 2.0771 - classification_loss: 0.4251 203/500 [===========>..................] - ETA: 1:38 - loss: 2.5025 - regression_loss: 2.0770 - classification_loss: 0.4256 204/500 [===========>..................] - ETA: 1:37 - loss: 2.5027 - regression_loss: 2.0775 - classification_loss: 0.4251 205/500 [===========>..................] - ETA: 1:37 - loss: 2.5030 - regression_loss: 2.0779 - classification_loss: 0.4251 206/500 [===========>..................] - ETA: 1:37 - loss: 2.5039 - regression_loss: 2.0788 - classification_loss: 0.4252 207/500 [===========>..................] - ETA: 1:36 - loss: 2.5059 - regression_loss: 2.0802 - classification_loss: 0.4256 208/500 [===========>..................] - ETA: 1:36 - loss: 2.5066 - regression_loss: 2.0810 - classification_loss: 0.4256 209/500 [===========>..................] - ETA: 1:36 - loss: 2.5068 - regression_loss: 2.0813 - classification_loss: 0.4255 210/500 [===========>..................] - ETA: 1:35 - loss: 2.5074 - regression_loss: 2.0819 - classification_loss: 0.4256 211/500 [===========>..................] - ETA: 1:35 - loss: 2.5050 - regression_loss: 2.0796 - classification_loss: 0.4254 212/500 [===========>..................] - ETA: 1:35 - loss: 2.5040 - regression_loss: 2.0786 - classification_loss: 0.4254 213/500 [===========>..................] - ETA: 1:34 - loss: 2.5021 - regression_loss: 2.0774 - classification_loss: 0.4248 214/500 [===========>..................] - ETA: 1:34 - loss: 2.4994 - regression_loss: 2.0750 - classification_loss: 0.4244 215/500 [===========>..................] - ETA: 1:34 - loss: 2.5025 - regression_loss: 2.0777 - classification_loss: 0.4248 216/500 [===========>..................] 
- ETA: 1:33 - loss: 2.5038 - regression_loss: 2.0786 - classification_loss: 0.4252 217/500 [============>.................] - ETA: 1:33 - loss: 2.5034 - regression_loss: 2.0779 - classification_loss: 0.4255 218/500 [============>.................] - ETA: 1:33 - loss: 2.5061 - regression_loss: 2.0799 - classification_loss: 0.4262 219/500 [============>.................] - ETA: 1:32 - loss: 2.5071 - regression_loss: 2.0806 - classification_loss: 0.4265 220/500 [============>.................] - ETA: 1:32 - loss: 2.5066 - regression_loss: 2.0804 - classification_loss: 0.4262 221/500 [============>.................] - ETA: 1:32 - loss: 2.5049 - regression_loss: 2.0791 - classification_loss: 0.4258 222/500 [============>.................] - ETA: 1:31 - loss: 2.5050 - regression_loss: 2.0794 - classification_loss: 0.4256 223/500 [============>.................] - ETA: 1:31 - loss: 2.5042 - regression_loss: 2.0784 - classification_loss: 0.4258 224/500 [============>.................] - ETA: 1:31 - loss: 2.5046 - regression_loss: 2.0787 - classification_loss: 0.4259 225/500 [============>.................] - ETA: 1:30 - loss: 2.5048 - regression_loss: 2.0794 - classification_loss: 0.4253 226/500 [============>.................] - ETA: 1:30 - loss: 2.5027 - regression_loss: 2.0779 - classification_loss: 0.4249 227/500 [============>.................] - ETA: 1:30 - loss: 2.5039 - regression_loss: 2.0788 - classification_loss: 0.4251 228/500 [============>.................] - ETA: 1:29 - loss: 2.5039 - regression_loss: 2.0788 - classification_loss: 0.4251 229/500 [============>.................] - ETA: 1:29 - loss: 2.5049 - regression_loss: 2.0797 - classification_loss: 0.4253 230/500 [============>.................] - ETA: 1:29 - loss: 2.5043 - regression_loss: 2.0794 - classification_loss: 0.4249 231/500 [============>.................] - ETA: 1:28 - loss: 2.5033 - regression_loss: 2.0783 - classification_loss: 0.4250 232/500 [============>.................] 
- ETA: 1:28 - loss: 2.5032 - regression_loss: 2.0782 - classification_loss: 0.4250 233/500 [============>.................] - ETA: 1:28 - loss: 2.5035 - regression_loss: 2.0786 - classification_loss: 0.4249 234/500 [=============>................] - ETA: 1:27 - loss: 2.5030 - regression_loss: 2.0787 - classification_loss: 0.4243 235/500 [=============>................] - ETA: 1:27 - loss: 2.5005 - regression_loss: 2.0766 - classification_loss: 0.4239 236/500 [=============>................] - ETA: 1:27 - loss: 2.5016 - regression_loss: 2.0777 - classification_loss: 0.4239 237/500 [=============>................] - ETA: 1:26 - loss: 2.5022 - regression_loss: 2.0785 - classification_loss: 0.4237 238/500 [=============>................] - ETA: 1:26 - loss: 2.5032 - regression_loss: 2.0793 - classification_loss: 0.4239 239/500 [=============>................] - ETA: 1:26 - loss: 2.5035 - regression_loss: 2.0795 - classification_loss: 0.4239 240/500 [=============>................] - ETA: 1:25 - loss: 2.5039 - regression_loss: 2.0801 - classification_loss: 0.4238 241/500 [=============>................] - ETA: 1:25 - loss: 2.5058 - regression_loss: 2.0814 - classification_loss: 0.4244 242/500 [=============>................] - ETA: 1:25 - loss: 2.5025 - regression_loss: 2.0786 - classification_loss: 0.4239 243/500 [=============>................] - ETA: 1:24 - loss: 2.5021 - regression_loss: 2.0785 - classification_loss: 0.4236 244/500 [=============>................] - ETA: 1:24 - loss: 2.5043 - regression_loss: 2.0799 - classification_loss: 0.4244 245/500 [=============>................] - ETA: 1:24 - loss: 2.5059 - regression_loss: 2.0814 - classification_loss: 0.4246 246/500 [=============>................] - ETA: 1:23 - loss: 2.5054 - regression_loss: 2.0810 - classification_loss: 0.4244 247/500 [=============>................] - ETA: 1:23 - loss: 2.5066 - regression_loss: 2.0813 - classification_loss: 0.4253 248/500 [=============>................] 
- ETA: 1:23 - loss: 2.5061 - regression_loss: 2.0810 - classification_loss: 0.4251 249/500 [=============>................] - ETA: 1:22 - loss: 2.5044 - regression_loss: 2.0797 - classification_loss: 0.4248 250/500 [==============>...............] - ETA: 1:22 - loss: 2.5055 - regression_loss: 2.0804 - classification_loss: 0.4251 251/500 [==============>...............] - ETA: 1:22 - loss: 2.5062 - regression_loss: 2.0810 - classification_loss: 0.4252 252/500 [==============>...............] - ETA: 1:21 - loss: 2.5064 - regression_loss: 2.0812 - classification_loss: 0.4251 253/500 [==============>...............] - ETA: 1:21 - loss: 2.5050 - regression_loss: 2.0803 - classification_loss: 0.4247 254/500 [==============>...............] - ETA: 1:21 - loss: 2.5056 - regression_loss: 2.0807 - classification_loss: 0.4249 255/500 [==============>...............] - ETA: 1:20 - loss: 2.5051 - regression_loss: 2.0804 - classification_loss: 0.4247 256/500 [==============>...............] - ETA: 1:20 - loss: 2.5053 - regression_loss: 2.0804 - classification_loss: 0.4249 257/500 [==============>...............] - ETA: 1:20 - loss: 2.5053 - regression_loss: 2.0805 - classification_loss: 0.4248 258/500 [==============>...............] - ETA: 1:19 - loss: 2.5044 - regression_loss: 2.0798 - classification_loss: 0.4245 259/500 [==============>...............] - ETA: 1:19 - loss: 2.5035 - regression_loss: 2.0791 - classification_loss: 0.4244 260/500 [==============>...............] - ETA: 1:19 - loss: 2.5049 - regression_loss: 2.0802 - classification_loss: 0.4247 261/500 [==============>...............] - ETA: 1:18 - loss: 2.5051 - regression_loss: 2.0804 - classification_loss: 0.4247 262/500 [==============>...............] - ETA: 1:18 - loss: 2.5063 - regression_loss: 2.0814 - classification_loss: 0.4249 263/500 [==============>...............] - ETA: 1:18 - loss: 2.5069 - regression_loss: 2.0820 - classification_loss: 0.4249 264/500 [==============>...............] 
- ETA: 1:17 - loss: 2.5081 - regression_loss: 2.0831 - classification_loss: 0.4250 265/500 [==============>...............] - ETA: 1:17 - loss: 2.5085 - regression_loss: 2.0834 - classification_loss: 0.4251 266/500 [==============>...............] - ETA: 1:17 - loss: 2.5085 - regression_loss: 2.0834 - classification_loss: 0.4250 267/500 [===============>..............] - ETA: 1:16 - loss: 2.5091 - regression_loss: 2.0839 - classification_loss: 0.4252 268/500 [===============>..............] - ETA: 1:16 - loss: 2.5087 - regression_loss: 2.0835 - classification_loss: 0.4253 269/500 [===============>..............] - ETA: 1:16 - loss: 2.5090 - regression_loss: 2.0837 - classification_loss: 0.4252 270/500 [===============>..............] - ETA: 1:15 - loss: 2.5092 - regression_loss: 2.0840 - classification_loss: 0.4252 271/500 [===============>..............] - ETA: 1:15 - loss: 2.5095 - regression_loss: 2.0843 - classification_loss: 0.4252 272/500 [===============>..............] - ETA: 1:15 - loss: 2.5077 - regression_loss: 2.0832 - classification_loss: 0.4245 273/500 [===============>..............] - ETA: 1:14 - loss: 2.5085 - regression_loss: 2.0837 - classification_loss: 0.4247 274/500 [===============>..............] - ETA: 1:14 - loss: 2.5092 - regression_loss: 2.0842 - classification_loss: 0.4249 275/500 [===============>..............] - ETA: 1:14 - loss: 2.5094 - regression_loss: 2.0845 - classification_loss: 0.4249 276/500 [===============>..............] - ETA: 1:13 - loss: 2.5099 - regression_loss: 2.0850 - classification_loss: 0.4249 277/500 [===============>..............] - ETA: 1:13 - loss: 2.5102 - regression_loss: 2.0853 - classification_loss: 0.4249 278/500 [===============>..............] - ETA: 1:13 - loss: 2.5072 - regression_loss: 2.0827 - classification_loss: 0.4244 279/500 [===============>..............] - ETA: 1:12 - loss: 2.5076 - regression_loss: 2.0831 - classification_loss: 0.4245 280/500 [===============>..............] 
- ETA: 1:12 - loss: 2.5071 - regression_loss: 2.0828 - classification_loss: 0.4243 281/500 [===============>..............] - ETA: 1:12 - loss: 2.5073 - regression_loss: 2.0831 - classification_loss: 0.4242 282/500 [===============>..............] - ETA: 1:11 - loss: 2.5059 - regression_loss: 2.0821 - classification_loss: 0.4238 283/500 [===============>..............] - ETA: 1:11 - loss: 2.5058 - regression_loss: 2.0819 - classification_loss: 0.4238 284/500 [================>.............] - ETA: 1:11 - loss: 2.5059 - regression_loss: 2.0820 - classification_loss: 0.4238 285/500 [================>.............] - ETA: 1:10 - loss: 2.5064 - regression_loss: 2.0824 - classification_loss: 0.4240 286/500 [================>.............] - ETA: 1:10 - loss: 2.5063 - regression_loss: 2.0823 - classification_loss: 0.4239 287/500 [================>.............] - ETA: 1:10 - loss: 2.5069 - regression_loss: 2.0826 - classification_loss: 0.4243 288/500 [================>.............] - ETA: 1:09 - loss: 2.5067 - regression_loss: 2.0824 - classification_loss: 0.4244 289/500 [================>.............] - ETA: 1:09 - loss: 2.5069 - regression_loss: 2.0825 - classification_loss: 0.4244 290/500 [================>.............] - ETA: 1:09 - loss: 2.5068 - regression_loss: 2.0825 - classification_loss: 0.4243 291/500 [================>.............] - ETA: 1:09 - loss: 2.5074 - regression_loss: 2.0828 - classification_loss: 0.4246 292/500 [================>.............] - ETA: 1:08 - loss: 2.5051 - regression_loss: 2.0810 - classification_loss: 0.4241 293/500 [================>.............] - ETA: 1:08 - loss: 2.5047 - regression_loss: 2.0808 - classification_loss: 0.4240 294/500 [================>.............] - ETA: 1:07 - loss: 2.5035 - regression_loss: 2.0797 - classification_loss: 0.4238 295/500 [================>.............] - ETA: 1:07 - loss: 2.5043 - regression_loss: 2.0802 - classification_loss: 0.4241 296/500 [================>.............] 
- ETA: 1:07 - loss: 2.5038 - regression_loss: 2.0799 - classification_loss: 0.4239 297/500 [================>.............] - ETA: 1:06 - loss: 2.5048 - regression_loss: 2.0806 - classification_loss: 0.4242 298/500 [================>.............] - ETA: 1:06 - loss: 2.5057 - regression_loss: 2.0811 - classification_loss: 0.4245 299/500 [================>.............] - ETA: 1:06 - loss: 2.5042 - regression_loss: 2.0798 - classification_loss: 0.4244 300/500 [=================>............] - ETA: 1:06 - loss: 2.5041 - regression_loss: 2.0798 - classification_loss: 0.4244 301/500 [=================>............] - ETA: 1:05 - loss: 2.5041 - regression_loss: 2.0797 - classification_loss: 0.4244 302/500 [=================>............] - ETA: 1:05 - loss: 2.5036 - regression_loss: 2.0793 - classification_loss: 0.4243 303/500 [=================>............] - ETA: 1:05 - loss: 2.5040 - regression_loss: 2.0796 - classification_loss: 0.4243 304/500 [=================>............] - ETA: 1:04 - loss: 2.5050 - regression_loss: 2.0808 - classification_loss: 0.4241 305/500 [=================>............] - ETA: 1:04 - loss: 2.5053 - regression_loss: 2.0809 - classification_loss: 0.4244 306/500 [=================>............] - ETA: 1:04 - loss: 2.5048 - regression_loss: 2.0804 - classification_loss: 0.4244 307/500 [=================>............] - ETA: 1:03 - loss: 2.5051 - regression_loss: 2.0807 - classification_loss: 0.4244 308/500 [=================>............] - ETA: 1:03 - loss: 2.5057 - regression_loss: 2.0812 - classification_loss: 0.4245 309/500 [=================>............] - ETA: 1:02 - loss: 2.5058 - regression_loss: 2.0814 - classification_loss: 0.4244 310/500 [=================>............] - ETA: 1:02 - loss: 2.5061 - regression_loss: 2.0816 - classification_loss: 0.4245 311/500 [=================>............] - ETA: 1:02 - loss: 2.5049 - regression_loss: 2.0807 - classification_loss: 0.4242 312/500 [=================>............] 
- ETA: 1:02 - loss: 2.5046 - regression_loss: 2.0805 - classification_loss: 0.4241 313/500 [=================>............] - ETA: 1:01 - loss: 2.5046 - regression_loss: 2.0804 - classification_loss: 0.4242 314/500 [=================>............] - ETA: 1:01 - loss: 2.5040 - regression_loss: 2.0797 - classification_loss: 0.4243 315/500 [=================>............] - ETA: 1:01 - loss: 2.5041 - regression_loss: 2.0795 - classification_loss: 0.4246 316/500 [=================>............] - ETA: 1:00 - loss: 2.5035 - regression_loss: 2.0794 - classification_loss: 0.4241 317/500 [==================>...........] - ETA: 1:00 - loss: 2.5033 - regression_loss: 2.0792 - classification_loss: 0.4241 318/500 [==================>...........] - ETA: 1:00 - loss: 2.5031 - regression_loss: 2.0793 - classification_loss: 0.4239 319/500 [==================>...........] - ETA: 59s - loss: 2.5013 - regression_loss: 2.0778 - classification_loss: 0.4235  320/500 [==================>...........] - ETA: 59s - loss: 2.5020 - regression_loss: 2.0783 - classification_loss: 0.4237 321/500 [==================>...........] - ETA: 59s - loss: 2.5023 - regression_loss: 2.0784 - classification_loss: 0.4239 322/500 [==================>...........] - ETA: 58s - loss: 2.5027 - regression_loss: 2.0787 - classification_loss: 0.4240 323/500 [==================>...........] - ETA: 58s - loss: 2.5023 - regression_loss: 2.0785 - classification_loss: 0.4237 324/500 [==================>...........] - ETA: 58s - loss: 2.5047 - regression_loss: 2.0801 - classification_loss: 0.4246 325/500 [==================>...........] - ETA: 57s - loss: 2.5044 - regression_loss: 2.0799 - classification_loss: 0.4246 326/500 [==================>...........] - ETA: 57s - loss: 2.5031 - regression_loss: 2.0789 - classification_loss: 0.4242 327/500 [==================>...........] - ETA: 57s - loss: 2.5026 - regression_loss: 2.0786 - classification_loss: 0.4241 328/500 [==================>...........] 
- ETA: 56s - loss: 2.5009 - regression_loss: 2.0772 - classification_loss: 0.4237 329/500 [==================>...........] - ETA: 56s - loss: 2.4997 - regression_loss: 2.0765 - classification_loss: 0.4232 330/500 [==================>...........] - ETA: 56s - loss: 2.4974 - regression_loss: 2.0747 - classification_loss: 0.4227 331/500 [==================>...........] - ETA: 55s - loss: 2.4980 - regression_loss: 2.0750 - classification_loss: 0.4229 332/500 [==================>...........] - ETA: 55s - loss: 2.4984 - regression_loss: 2.0757 - classification_loss: 0.4227 333/500 [==================>...........] - ETA: 55s - loss: 2.4991 - regression_loss: 2.0764 - classification_loss: 0.4227 334/500 [===================>..........] - ETA: 54s - loss: 2.4992 - regression_loss: 2.0768 - classification_loss: 0.4223 335/500 [===================>..........] - ETA: 54s - loss: 2.5007 - regression_loss: 2.0777 - classification_loss: 0.4230 336/500 [===================>..........] - ETA: 54s - loss: 2.5004 - regression_loss: 2.0775 - classification_loss: 0.4229 337/500 [===================>..........] - ETA: 53s - loss: 2.4993 - regression_loss: 2.0766 - classification_loss: 0.4226 338/500 [===================>..........] - ETA: 53s - loss: 2.4992 - regression_loss: 2.0766 - classification_loss: 0.4226 339/500 [===================>..........] - ETA: 53s - loss: 2.4990 - regression_loss: 2.0765 - classification_loss: 0.4225 340/500 [===================>..........] - ETA: 52s - loss: 2.4992 - regression_loss: 2.0766 - classification_loss: 0.4226 341/500 [===================>..........] - ETA: 52s - loss: 2.5005 - regression_loss: 2.0775 - classification_loss: 0.4231 342/500 [===================>..........] - ETA: 52s - loss: 2.4994 - regression_loss: 2.0765 - classification_loss: 0.4228 343/500 [===================>..........] - ETA: 51s - loss: 2.4991 - regression_loss: 2.0763 - classification_loss: 0.4228 344/500 [===================>..........] 
- ETA: 51s - loss: 2.4994 - regression_loss: 2.0765 - classification_loss: 0.4228 345/500 [===================>..........] - ETA: 51s - loss: 2.4991 - regression_loss: 2.0765 - classification_loss: 0.4226 346/500 [===================>..........] - ETA: 50s - loss: 2.4990 - regression_loss: 2.0764 - classification_loss: 0.4227 347/500 [===================>..........] - ETA: 50s - loss: 2.4975 - regression_loss: 2.0752 - classification_loss: 0.4223 348/500 [===================>..........] - ETA: 50s - loss: 2.4973 - regression_loss: 2.0751 - classification_loss: 0.4222 349/500 [===================>..........] - ETA: 49s - loss: 2.4975 - regression_loss: 2.0753 - classification_loss: 0.4222 350/500 [====================>.........] - ETA: 49s - loss: 2.4982 - regression_loss: 2.0759 - classification_loss: 0.4222 351/500 [====================>.........] - ETA: 49s - loss: 2.4996 - regression_loss: 2.0768 - classification_loss: 0.4228 352/500 [====================>.........] - ETA: 48s - loss: 2.4997 - regression_loss: 2.0769 - classification_loss: 0.4228 353/500 [====================>.........] - ETA: 48s - loss: 2.4965 - regression_loss: 2.0743 - classification_loss: 0.4222 354/500 [====================>.........] - ETA: 48s - loss: 2.4957 - regression_loss: 2.0738 - classification_loss: 0.4219 355/500 [====================>.........] - ETA: 47s - loss: 2.4963 - regression_loss: 2.0743 - classification_loss: 0.4220 356/500 [====================>.........] - ETA: 47s - loss: 2.4972 - regression_loss: 2.0749 - classification_loss: 0.4224 357/500 [====================>.........] - ETA: 47s - loss: 2.4977 - regression_loss: 2.0752 - classification_loss: 0.4225 358/500 [====================>.........] - ETA: 46s - loss: 2.4972 - regression_loss: 2.0749 - classification_loss: 0.4224 359/500 [====================>.........] - ETA: 46s - loss: 2.4969 - regression_loss: 2.0748 - classification_loss: 0.4221 360/500 [====================>.........] 
- ETA: 46s - loss: 2.4955 - regression_loss: 2.0734 - classification_loss: 0.4221 361/500 [====================>.........] - ETA: 45s - loss: 2.4958 - regression_loss: 2.0737 - classification_loss: 0.4221 362/500 [====================>.........] - ETA: 45s - loss: 2.4968 - regression_loss: 2.0745 - classification_loss: 0.4223 363/500 [====================>.........] - ETA: 45s - loss: 2.4968 - regression_loss: 2.0746 - classification_loss: 0.4221 364/500 [====================>.........] - ETA: 44s - loss: 2.4956 - regression_loss: 2.0737 - classification_loss: 0.4219 365/500 [====================>.........] - ETA: 44s - loss: 2.4952 - regression_loss: 2.0736 - classification_loss: 0.4216 366/500 [====================>.........] - ETA: 44s - loss: 2.4952 - regression_loss: 2.0737 - classification_loss: 0.4216 367/500 [=====================>........] - ETA: 43s - loss: 2.4952 - regression_loss: 2.0737 - classification_loss: 0.4215 368/500 [=====================>........] - ETA: 43s - loss: 2.4953 - regression_loss: 2.0735 - classification_loss: 0.4218 369/500 [=====================>........] - ETA: 43s - loss: 2.4957 - regression_loss: 2.0738 - classification_loss: 0.4219 370/500 [=====================>........] - ETA: 42s - loss: 2.4961 - regression_loss: 2.0741 - classification_loss: 0.4220 371/500 [=====================>........] - ETA: 42s - loss: 2.4945 - regression_loss: 2.0726 - classification_loss: 0.4220 372/500 [=====================>........] - ETA: 42s - loss: 2.4954 - regression_loss: 2.0736 - classification_loss: 0.4218 373/500 [=====================>........] - ETA: 41s - loss: 2.4955 - regression_loss: 2.0736 - classification_loss: 0.4218 374/500 [=====================>........] - ETA: 41s - loss: 2.4934 - regression_loss: 2.0718 - classification_loss: 0.4216 375/500 [=====================>........] - ETA: 41s - loss: 2.4936 - regression_loss: 2.0721 - classification_loss: 0.4215 376/500 [=====================>........] 
[Epoch 5 per-batch progress-bar updates for batches 377-488 elided; running loss declined from 2.4934 (regression_loss: 2.0719, classification_loss: 0.4215) to 2.4668 (regression_loss: 2.0492, classification_loss: 0.4176) over this span]
500/500 [==============================] - 165s 331ms/step - loss: 2.4611 - regression_loss: 2.0448 - classification_loss: 0.4162
1172 instances of class plum with average precision: 0.2547
mAP: 0.2547
Epoch 00005: saving model to ./training/snapshots/resnet101_pascal_05.h5
Epoch 6/150
[Epoch 6 per-batch progress-bar updates for batches 1-210 elided; running loss ≈ 2.3655 (regression_loss: 1.9736, classification_loss: 0.3919) at batch 210/500]
- ETA: 1:34 - loss: 2.3658 - regression_loss: 1.9741 - classification_loss: 0.3917 212/500 [===========>..................] - ETA: 1:34 - loss: 2.3674 - regression_loss: 1.9754 - classification_loss: 0.3920 213/500 [===========>..................] - ETA: 1:34 - loss: 2.3674 - regression_loss: 1.9755 - classification_loss: 0.3919 214/500 [===========>..................] - ETA: 1:33 - loss: 2.3689 - regression_loss: 1.9769 - classification_loss: 0.3920 215/500 [===========>..................] - ETA: 1:33 - loss: 2.3695 - regression_loss: 1.9773 - classification_loss: 0.3922 216/500 [===========>..................] - ETA: 1:33 - loss: 2.3670 - regression_loss: 1.9754 - classification_loss: 0.3916 217/500 [============>.................] - ETA: 1:32 - loss: 2.3691 - regression_loss: 1.9776 - classification_loss: 0.3915 218/500 [============>.................] - ETA: 1:32 - loss: 2.3693 - regression_loss: 1.9778 - classification_loss: 0.3916 219/500 [============>.................] - ETA: 1:32 - loss: 2.3705 - regression_loss: 1.9788 - classification_loss: 0.3917 220/500 [============>.................] - ETA: 1:31 - loss: 2.3665 - regression_loss: 1.9753 - classification_loss: 0.3912 221/500 [============>.................] - ETA: 1:31 - loss: 2.3637 - regression_loss: 1.9729 - classification_loss: 0.3907 222/500 [============>.................] - ETA: 1:31 - loss: 2.3642 - regression_loss: 1.9735 - classification_loss: 0.3907 223/500 [============>.................] - ETA: 1:30 - loss: 2.3626 - regression_loss: 1.9722 - classification_loss: 0.3904 224/500 [============>.................] - ETA: 1:30 - loss: 2.3638 - regression_loss: 1.9731 - classification_loss: 0.3908 225/500 [============>.................] - ETA: 1:30 - loss: 2.3633 - regression_loss: 1.9728 - classification_loss: 0.3905 226/500 [============>.................] - ETA: 1:29 - loss: 2.3626 - regression_loss: 1.9725 - classification_loss: 0.3901 227/500 [============>.................] 
- ETA: 1:29 - loss: 2.3649 - regression_loss: 1.9745 - classification_loss: 0.3903 228/500 [============>.................] - ETA: 1:29 - loss: 2.3660 - regression_loss: 1.9756 - classification_loss: 0.3904 229/500 [============>.................] - ETA: 1:28 - loss: 2.3669 - regression_loss: 1.9764 - classification_loss: 0.3905 230/500 [============>.................] - ETA: 1:28 - loss: 2.3681 - regression_loss: 1.9773 - classification_loss: 0.3907 231/500 [============>.................] - ETA: 1:28 - loss: 2.3689 - regression_loss: 1.9780 - classification_loss: 0.3909 232/500 [============>.................] - ETA: 1:27 - loss: 2.3646 - regression_loss: 1.9744 - classification_loss: 0.3903 233/500 [============>.................] - ETA: 1:27 - loss: 2.3637 - regression_loss: 1.9734 - classification_loss: 0.3903 234/500 [=============>................] - ETA: 1:27 - loss: 2.3613 - regression_loss: 1.9709 - classification_loss: 0.3904 235/500 [=============>................] - ETA: 1:26 - loss: 2.3606 - regression_loss: 1.9703 - classification_loss: 0.3903 236/500 [=============>................] - ETA: 1:26 - loss: 2.3624 - regression_loss: 1.9715 - classification_loss: 0.3908 237/500 [=============>................] - ETA: 1:26 - loss: 2.3638 - regression_loss: 1.9729 - classification_loss: 0.3908 238/500 [=============>................] - ETA: 1:26 - loss: 2.3651 - regression_loss: 1.9743 - classification_loss: 0.3909 239/500 [=============>................] - ETA: 1:25 - loss: 2.3620 - regression_loss: 1.9718 - classification_loss: 0.3902 240/500 [=============>................] - ETA: 1:25 - loss: 2.3616 - regression_loss: 1.9712 - classification_loss: 0.3903 241/500 [=============>................] - ETA: 1:25 - loss: 2.3620 - regression_loss: 1.9714 - classification_loss: 0.3906 242/500 [=============>................] - ETA: 1:24 - loss: 2.3621 - regression_loss: 1.9716 - classification_loss: 0.3905 243/500 [=============>................] 
- ETA: 1:24 - loss: 2.3631 - regression_loss: 1.9723 - classification_loss: 0.3907 244/500 [=============>................] - ETA: 1:23 - loss: 2.3628 - regression_loss: 1.9721 - classification_loss: 0.3907 245/500 [=============>................] - ETA: 1:23 - loss: 2.3631 - regression_loss: 1.9725 - classification_loss: 0.3905 246/500 [=============>................] - ETA: 1:23 - loss: 2.3610 - regression_loss: 1.9708 - classification_loss: 0.3901 247/500 [=============>................] - ETA: 1:23 - loss: 2.3605 - regression_loss: 1.9707 - classification_loss: 0.3898 248/500 [=============>................] - ETA: 1:22 - loss: 2.3606 - regression_loss: 1.9705 - classification_loss: 0.3902 249/500 [=============>................] - ETA: 1:22 - loss: 2.3603 - regression_loss: 1.9701 - classification_loss: 0.3902 250/500 [==============>...............] - ETA: 1:22 - loss: 2.3607 - regression_loss: 1.9704 - classification_loss: 0.3903 251/500 [==============>...............] - ETA: 1:21 - loss: 2.3600 - regression_loss: 1.9701 - classification_loss: 0.3899 252/500 [==============>...............] - ETA: 1:21 - loss: 2.3606 - regression_loss: 1.9703 - classification_loss: 0.3903 253/500 [==============>...............] - ETA: 1:21 - loss: 2.3602 - regression_loss: 1.9699 - classification_loss: 0.3903 254/500 [==============>...............] - ETA: 1:20 - loss: 2.3604 - regression_loss: 1.9702 - classification_loss: 0.3901 255/500 [==============>...............] - ETA: 1:20 - loss: 2.3601 - regression_loss: 1.9699 - classification_loss: 0.3902 256/500 [==============>...............] - ETA: 1:20 - loss: 2.3606 - regression_loss: 1.9706 - classification_loss: 0.3900 257/500 [==============>...............] - ETA: 1:19 - loss: 2.3601 - regression_loss: 1.9705 - classification_loss: 0.3897 258/500 [==============>...............] - ETA: 1:19 - loss: 2.3612 - regression_loss: 1.9712 - classification_loss: 0.3900 259/500 [==============>...............] 
- ETA: 1:19 - loss: 2.3610 - regression_loss: 1.9710 - classification_loss: 0.3900 260/500 [==============>...............] - ETA: 1:18 - loss: 2.3615 - regression_loss: 1.9713 - classification_loss: 0.3902 261/500 [==============>...............] - ETA: 1:18 - loss: 2.3640 - regression_loss: 1.9733 - classification_loss: 0.3907 262/500 [==============>...............] - ETA: 1:18 - loss: 2.3631 - regression_loss: 1.9726 - classification_loss: 0.3905 263/500 [==============>...............] - ETA: 1:17 - loss: 2.3626 - regression_loss: 1.9723 - classification_loss: 0.3902 264/500 [==============>...............] - ETA: 1:17 - loss: 2.3622 - regression_loss: 1.9725 - classification_loss: 0.3897 265/500 [==============>...............] - ETA: 1:17 - loss: 2.3610 - regression_loss: 1.9717 - classification_loss: 0.3893 266/500 [==============>...............] - ETA: 1:16 - loss: 2.3617 - regression_loss: 1.9722 - classification_loss: 0.3895 267/500 [===============>..............] - ETA: 1:16 - loss: 2.3638 - regression_loss: 1.9738 - classification_loss: 0.3900 268/500 [===============>..............] - ETA: 1:16 - loss: 2.3633 - regression_loss: 1.9734 - classification_loss: 0.3900 269/500 [===============>..............] - ETA: 1:15 - loss: 2.3652 - regression_loss: 1.9750 - classification_loss: 0.3903 270/500 [===============>..............] - ETA: 1:15 - loss: 2.3643 - regression_loss: 1.9743 - classification_loss: 0.3900 271/500 [===============>..............] - ETA: 1:15 - loss: 2.3651 - regression_loss: 1.9748 - classification_loss: 0.3902 272/500 [===============>..............] - ETA: 1:14 - loss: 2.3651 - regression_loss: 1.9749 - classification_loss: 0.3902 273/500 [===============>..............] - ETA: 1:14 - loss: 2.3653 - regression_loss: 1.9750 - classification_loss: 0.3903 274/500 [===============>..............] - ETA: 1:14 - loss: 2.3654 - regression_loss: 1.9751 - classification_loss: 0.3903 275/500 [===============>..............] 
- ETA: 1:13 - loss: 2.3673 - regression_loss: 1.9765 - classification_loss: 0.3908 276/500 [===============>..............] - ETA: 1:13 - loss: 2.3690 - regression_loss: 1.9774 - classification_loss: 0.3915 277/500 [===============>..............] - ETA: 1:13 - loss: 2.3676 - regression_loss: 1.9761 - classification_loss: 0.3915 278/500 [===============>..............] - ETA: 1:12 - loss: 2.3682 - regression_loss: 1.9766 - classification_loss: 0.3916 279/500 [===============>..............] - ETA: 1:12 - loss: 2.3677 - regression_loss: 1.9758 - classification_loss: 0.3919 280/500 [===============>..............] - ETA: 1:12 - loss: 2.3658 - regression_loss: 1.9743 - classification_loss: 0.3915 281/500 [===============>..............] - ETA: 1:11 - loss: 2.3663 - regression_loss: 1.9750 - classification_loss: 0.3913 282/500 [===============>..............] - ETA: 1:11 - loss: 2.3668 - regression_loss: 1.9755 - classification_loss: 0.3913 283/500 [===============>..............] - ETA: 1:11 - loss: 2.3636 - regression_loss: 1.9728 - classification_loss: 0.3908 284/500 [================>.............] - ETA: 1:10 - loss: 2.3596 - regression_loss: 1.9695 - classification_loss: 0.3901 285/500 [================>.............] - ETA: 1:10 - loss: 2.3586 - regression_loss: 1.9688 - classification_loss: 0.3898 286/500 [================>.............] - ETA: 1:10 - loss: 2.3589 - regression_loss: 1.9690 - classification_loss: 0.3899 287/500 [================>.............] - ETA: 1:09 - loss: 2.3588 - regression_loss: 1.9688 - classification_loss: 0.3900 288/500 [================>.............] - ETA: 1:09 - loss: 2.3575 - regression_loss: 1.9679 - classification_loss: 0.3896 289/500 [================>.............] - ETA: 1:09 - loss: 2.3549 - regression_loss: 1.9659 - classification_loss: 0.3890 290/500 [================>.............] - ETA: 1:08 - loss: 2.3549 - regression_loss: 1.9658 - classification_loss: 0.3891 291/500 [================>.............] 
- ETA: 1:08 - loss: 2.3560 - regression_loss: 1.9666 - classification_loss: 0.3895 292/500 [================>.............] - ETA: 1:08 - loss: 2.3543 - regression_loss: 1.9652 - classification_loss: 0.3891 293/500 [================>.............] - ETA: 1:07 - loss: 2.3543 - regression_loss: 1.9652 - classification_loss: 0.3891 294/500 [================>.............] - ETA: 1:07 - loss: 2.3547 - regression_loss: 1.9656 - classification_loss: 0.3891 295/500 [================>.............] - ETA: 1:07 - loss: 2.3546 - regression_loss: 1.9656 - classification_loss: 0.3890 296/500 [================>.............] - ETA: 1:07 - loss: 2.3529 - regression_loss: 1.9634 - classification_loss: 0.3895 297/500 [================>.............] - ETA: 1:06 - loss: 2.3521 - regression_loss: 1.9627 - classification_loss: 0.3893 298/500 [================>.............] - ETA: 1:06 - loss: 2.3520 - regression_loss: 1.9628 - classification_loss: 0.3892 299/500 [================>.............] - ETA: 1:06 - loss: 2.3497 - regression_loss: 1.9610 - classification_loss: 0.3887 300/500 [=================>............] - ETA: 1:05 - loss: 2.3504 - regression_loss: 1.9615 - classification_loss: 0.3889 301/500 [=================>............] - ETA: 1:05 - loss: 2.3504 - regression_loss: 1.9616 - classification_loss: 0.3888 302/500 [=================>............] - ETA: 1:05 - loss: 2.3504 - regression_loss: 1.9617 - classification_loss: 0.3887 303/500 [=================>............] - ETA: 1:04 - loss: 2.3529 - regression_loss: 1.9632 - classification_loss: 0.3897 304/500 [=================>............] - ETA: 1:04 - loss: 2.3536 - regression_loss: 1.9637 - classification_loss: 0.3899 305/500 [=================>............] - ETA: 1:04 - loss: 2.3540 - regression_loss: 1.9640 - classification_loss: 0.3900 306/500 [=================>............] - ETA: 1:03 - loss: 2.3547 - regression_loss: 1.9647 - classification_loss: 0.3901 307/500 [=================>............] 
- ETA: 1:03 - loss: 2.3542 - regression_loss: 1.9643 - classification_loss: 0.3899 308/500 [=================>............] - ETA: 1:03 - loss: 2.3543 - regression_loss: 1.9645 - classification_loss: 0.3898 309/500 [=================>............] - ETA: 1:02 - loss: 2.3521 - regression_loss: 1.9628 - classification_loss: 0.3892 310/500 [=================>............] - ETA: 1:02 - loss: 2.3515 - regression_loss: 1.9622 - classification_loss: 0.3893 311/500 [=================>............] - ETA: 1:02 - loss: 2.3517 - regression_loss: 1.9624 - classification_loss: 0.3894 312/500 [=================>............] - ETA: 1:01 - loss: 2.3522 - regression_loss: 1.9628 - classification_loss: 0.3893 313/500 [=================>............] - ETA: 1:01 - loss: 2.3520 - regression_loss: 1.9627 - classification_loss: 0.3893 314/500 [=================>............] - ETA: 1:01 - loss: 2.3516 - regression_loss: 1.9624 - classification_loss: 0.3892 315/500 [=================>............] - ETA: 1:00 - loss: 2.3518 - regression_loss: 1.9626 - classification_loss: 0.3892 316/500 [=================>............] - ETA: 1:00 - loss: 2.3511 - regression_loss: 1.9619 - classification_loss: 0.3892 317/500 [==================>...........] - ETA: 1:00 - loss: 2.3510 - regression_loss: 1.9622 - classification_loss: 0.3888 318/500 [==================>...........] - ETA: 59s - loss: 2.3511 - regression_loss: 1.9623 - classification_loss: 0.3888  319/500 [==================>...........] - ETA: 59s - loss: 2.3509 - regression_loss: 1.9623 - classification_loss: 0.3886 320/500 [==================>...........] - ETA: 59s - loss: 2.3514 - regression_loss: 1.9627 - classification_loss: 0.3887 321/500 [==================>...........] - ETA: 58s - loss: 2.3505 - regression_loss: 1.9621 - classification_loss: 0.3884 322/500 [==================>...........] - ETA: 58s - loss: 2.3516 - regression_loss: 1.9628 - classification_loss: 0.3888 323/500 [==================>...........] 
- ETA: 58s - loss: 2.3520 - regression_loss: 1.9631 - classification_loss: 0.3889 324/500 [==================>...........] - ETA: 57s - loss: 2.3533 - regression_loss: 1.9641 - classification_loss: 0.3892 325/500 [==================>...........] - ETA: 57s - loss: 2.3521 - regression_loss: 1.9628 - classification_loss: 0.3894 326/500 [==================>...........] - ETA: 57s - loss: 2.3532 - regression_loss: 1.9637 - classification_loss: 0.3895 327/500 [==================>...........] - ETA: 56s - loss: 2.3549 - regression_loss: 1.9651 - classification_loss: 0.3898 328/500 [==================>...........] - ETA: 56s - loss: 2.3552 - regression_loss: 1.9656 - classification_loss: 0.3896 329/500 [==================>...........] - ETA: 56s - loss: 2.3543 - regression_loss: 1.9649 - classification_loss: 0.3894 330/500 [==================>...........] - ETA: 55s - loss: 2.3543 - regression_loss: 1.9649 - classification_loss: 0.3895 331/500 [==================>...........] - ETA: 55s - loss: 2.3548 - regression_loss: 1.9654 - classification_loss: 0.3893 332/500 [==================>...........] - ETA: 55s - loss: 2.3548 - regression_loss: 1.9656 - classification_loss: 0.3892 333/500 [==================>...........] - ETA: 54s - loss: 2.3553 - regression_loss: 1.9660 - classification_loss: 0.3893 334/500 [===================>..........] - ETA: 54s - loss: 2.3536 - regression_loss: 1.9648 - classification_loss: 0.3888 335/500 [===================>..........] - ETA: 54s - loss: 2.3533 - regression_loss: 1.9646 - classification_loss: 0.3887 336/500 [===================>..........] - ETA: 53s - loss: 2.3536 - regression_loss: 1.9650 - classification_loss: 0.3887 337/500 [===================>..........] - ETA: 53s - loss: 2.3519 - regression_loss: 1.9634 - classification_loss: 0.3885 338/500 [===================>..........] - ETA: 53s - loss: 2.3524 - regression_loss: 1.9638 - classification_loss: 0.3886 339/500 [===================>..........] 
- ETA: 52s - loss: 2.3537 - regression_loss: 1.9646 - classification_loss: 0.3890 340/500 [===================>..........] - ETA: 52s - loss: 2.3540 - regression_loss: 1.9650 - classification_loss: 0.3890 341/500 [===================>..........] - ETA: 52s - loss: 2.3541 - regression_loss: 1.9651 - classification_loss: 0.3890 342/500 [===================>..........] - ETA: 51s - loss: 2.3551 - regression_loss: 1.9660 - classification_loss: 0.3892 343/500 [===================>..........] - ETA: 51s - loss: 2.3556 - regression_loss: 1.9665 - classification_loss: 0.3891 344/500 [===================>..........] - ETA: 51s - loss: 2.3546 - regression_loss: 1.9657 - classification_loss: 0.3889 345/500 [===================>..........] - ETA: 50s - loss: 2.3557 - regression_loss: 1.9664 - classification_loss: 0.3893 346/500 [===================>..........] - ETA: 50s - loss: 2.3535 - regression_loss: 1.9646 - classification_loss: 0.3889 347/500 [===================>..........] - ETA: 50s - loss: 2.3540 - regression_loss: 1.9651 - classification_loss: 0.3889 348/500 [===================>..........] - ETA: 49s - loss: 2.3507 - regression_loss: 1.9623 - classification_loss: 0.3884 349/500 [===================>..........] - ETA: 49s - loss: 2.3509 - regression_loss: 1.9626 - classification_loss: 0.3883 350/500 [====================>.........] - ETA: 49s - loss: 2.3513 - regression_loss: 1.9630 - classification_loss: 0.3883 351/500 [====================>.........] - ETA: 49s - loss: 2.3514 - regression_loss: 1.9630 - classification_loss: 0.3884 352/500 [====================>.........] - ETA: 48s - loss: 2.3526 - regression_loss: 1.9639 - classification_loss: 0.3887 353/500 [====================>.........] - ETA: 48s - loss: 2.3527 - regression_loss: 1.9639 - classification_loss: 0.3887 354/500 [====================>.........] - ETA: 48s - loss: 2.3531 - regression_loss: 1.9644 - classification_loss: 0.3887 355/500 [====================>.........] 
- ETA: 47s - loss: 2.3539 - regression_loss: 1.9649 - classification_loss: 0.3890 356/500 [====================>.........] - ETA: 47s - loss: 2.3535 - regression_loss: 1.9646 - classification_loss: 0.3889 357/500 [====================>.........] - ETA: 47s - loss: 2.3535 - regression_loss: 1.9646 - classification_loss: 0.3889 358/500 [====================>.........] - ETA: 46s - loss: 2.3536 - regression_loss: 1.9647 - classification_loss: 0.3889 359/500 [====================>.........] - ETA: 46s - loss: 2.3539 - regression_loss: 1.9651 - classification_loss: 0.3889 360/500 [====================>.........] - ETA: 46s - loss: 2.3535 - regression_loss: 1.9648 - classification_loss: 0.3887 361/500 [====================>.........] - ETA: 45s - loss: 2.3534 - regression_loss: 1.9647 - classification_loss: 0.3886 362/500 [====================>.........] - ETA: 45s - loss: 2.3508 - regression_loss: 1.9622 - classification_loss: 0.3885 363/500 [====================>.........] - ETA: 45s - loss: 2.3490 - regression_loss: 1.9608 - classification_loss: 0.3882 364/500 [====================>.........] - ETA: 44s - loss: 2.3494 - regression_loss: 1.9610 - classification_loss: 0.3884 365/500 [====================>.........] - ETA: 44s - loss: 2.3496 - regression_loss: 1.9612 - classification_loss: 0.3883 366/500 [====================>.........] - ETA: 44s - loss: 2.3481 - regression_loss: 1.9600 - classification_loss: 0.3881 367/500 [=====================>........] - ETA: 43s - loss: 2.3459 - regression_loss: 1.9580 - classification_loss: 0.3879 368/500 [=====================>........] - ETA: 43s - loss: 2.3454 - regression_loss: 1.9577 - classification_loss: 0.3877 369/500 [=====================>........] - ETA: 43s - loss: 2.3452 - regression_loss: 1.9575 - classification_loss: 0.3877 370/500 [=====================>........] - ETA: 42s - loss: 2.3452 - regression_loss: 1.9576 - classification_loss: 0.3876 371/500 [=====================>........] 
- ETA: 42s - loss: 2.3448 - regression_loss: 1.9576 - classification_loss: 0.3873 372/500 [=====================>........] - ETA: 42s - loss: 2.3455 - regression_loss: 1.9581 - classification_loss: 0.3874 373/500 [=====================>........] - ETA: 41s - loss: 2.3449 - regression_loss: 1.9576 - classification_loss: 0.3874 374/500 [=====================>........] - ETA: 41s - loss: 2.3441 - regression_loss: 1.9571 - classification_loss: 0.3870 375/500 [=====================>........] - ETA: 41s - loss: 2.3442 - regression_loss: 1.9572 - classification_loss: 0.3870 376/500 [=====================>........] - ETA: 40s - loss: 2.3428 - regression_loss: 1.9561 - classification_loss: 0.3867 377/500 [=====================>........] - ETA: 40s - loss: 2.3434 - regression_loss: 1.9566 - classification_loss: 0.3868 378/500 [=====================>........] - ETA: 40s - loss: 2.3441 - regression_loss: 1.9572 - classification_loss: 0.3868 379/500 [=====================>........] - ETA: 39s - loss: 2.3432 - regression_loss: 1.9567 - classification_loss: 0.3865 380/500 [=====================>........] - ETA: 39s - loss: 2.3434 - regression_loss: 1.9568 - classification_loss: 0.3866 381/500 [=====================>........] - ETA: 39s - loss: 2.3419 - regression_loss: 1.9558 - classification_loss: 0.3861 382/500 [=====================>........] - ETA: 38s - loss: 2.3429 - regression_loss: 1.9566 - classification_loss: 0.3863 383/500 [=====================>........] - ETA: 38s - loss: 2.3424 - regression_loss: 1.9563 - classification_loss: 0.3861 384/500 [======================>.......] - ETA: 38s - loss: 2.3421 - regression_loss: 1.9561 - classification_loss: 0.3860 385/500 [======================>.......] - ETA: 37s - loss: 2.3435 - regression_loss: 1.9568 - classification_loss: 0.3867 386/500 [======================>.......] - ETA: 37s - loss: 2.3435 - regression_loss: 1.9568 - classification_loss: 0.3867 387/500 [======================>.......] 
- ETA: 37s - loss: 2.3428 - regression_loss: 1.9562 - classification_loss: 0.3866 388/500 [======================>.......] - ETA: 36s - loss: 2.3417 - regression_loss: 1.9553 - classification_loss: 0.3863 389/500 [======================>.......] - ETA: 36s - loss: 2.3421 - regression_loss: 1.9557 - classification_loss: 0.3864 390/500 [======================>.......] - ETA: 36s - loss: 2.3427 - regression_loss: 1.9560 - classification_loss: 0.3867 391/500 [======================>.......] - ETA: 35s - loss: 2.3432 - regression_loss: 1.9566 - classification_loss: 0.3866 392/500 [======================>.......] - ETA: 35s - loss: 2.3429 - regression_loss: 1.9563 - classification_loss: 0.3866 393/500 [======================>.......] - ETA: 35s - loss: 2.3424 - regression_loss: 1.9560 - classification_loss: 0.3864 394/500 [======================>.......] - ETA: 34s - loss: 2.3439 - regression_loss: 1.9572 - classification_loss: 0.3867 395/500 [======================>.......] - ETA: 34s - loss: 2.3444 - regression_loss: 1.9576 - classification_loss: 0.3868 396/500 [======================>.......] - ETA: 34s - loss: 2.3449 - regression_loss: 1.9582 - classification_loss: 0.3868 397/500 [======================>.......] - ETA: 33s - loss: 2.3451 - regression_loss: 1.9583 - classification_loss: 0.3868 398/500 [======================>.......] - ETA: 33s - loss: 2.3452 - regression_loss: 1.9582 - classification_loss: 0.3870 399/500 [======================>.......] - ETA: 33s - loss: 2.3452 - regression_loss: 1.9582 - classification_loss: 0.3870 400/500 [=======================>......] - ETA: 32s - loss: 2.3453 - regression_loss: 1.9584 - classification_loss: 0.3870 401/500 [=======================>......] - ETA: 32s - loss: 2.3436 - regression_loss: 1.9564 - classification_loss: 0.3871 402/500 [=======================>......] - ETA: 32s - loss: 2.3437 - regression_loss: 1.9566 - classification_loss: 0.3871 403/500 [=======================>......] 
- ETA: 31s - loss: 2.3416 - regression_loss: 1.9548 - classification_loss: 0.3868 404/500 [=======================>......] - ETA: 31s - loss: 2.3416 - regression_loss: 1.9549 - classification_loss: 0.3867 405/500 [=======================>......] - ETA: 31s - loss: 2.3419 - regression_loss: 1.9550 - classification_loss: 0.3868 406/500 [=======================>......] - ETA: 30s - loss: 2.3419 - regression_loss: 1.9550 - classification_loss: 0.3868 407/500 [=======================>......] - ETA: 30s - loss: 2.3406 - regression_loss: 1.9539 - classification_loss: 0.3867 408/500 [=======================>......] - ETA: 30s - loss: 2.3377 - regression_loss: 1.9514 - classification_loss: 0.3863 409/500 [=======================>......] - ETA: 29s - loss: 2.3374 - regression_loss: 1.9512 - classification_loss: 0.3862 410/500 [=======================>......] - ETA: 29s - loss: 2.3355 - regression_loss: 1.9496 - classification_loss: 0.3859 411/500 [=======================>......] - ETA: 29s - loss: 2.3357 - regression_loss: 1.9499 - classification_loss: 0.3858 412/500 [=======================>......] - ETA: 28s - loss: 2.3365 - regression_loss: 1.9507 - classification_loss: 0.3859 413/500 [=======================>......] - ETA: 28s - loss: 2.3356 - regression_loss: 1.9496 - classification_loss: 0.3860 414/500 [=======================>......] - ETA: 28s - loss: 2.3355 - regression_loss: 1.9495 - classification_loss: 0.3860 415/500 [=======================>......] - ETA: 27s - loss: 2.3360 - regression_loss: 1.9501 - classification_loss: 0.3860 416/500 [=======================>......] - ETA: 27s - loss: 2.3355 - regression_loss: 1.9495 - classification_loss: 0.3860 417/500 [========================>.....] - ETA: 27s - loss: 2.3344 - regression_loss: 1.9487 - classification_loss: 0.3857 418/500 [========================>.....] - ETA: 26s - loss: 2.3347 - regression_loss: 1.9489 - classification_loss: 0.3858 419/500 [========================>.....] 
- ETA: 26s - loss: 2.3354 - regression_loss: 1.9495 - classification_loss: 0.3859
[... per-batch progress updates for steps 420-498 elided; running loss drifts from ~2.335 down to ~2.316 ...]
499/500 [============================>.]
- ETA: 0s - loss: 2.3162 - regression_loss: 1.9341 - classification_loss: 0.3822
500/500 [==============================] - 164s 329ms/step - loss: 2.3156 - regression_loss: 1.9336 - classification_loss: 0.3820
1172 instances of class plum with average precision: 0.3218
mAP: 0.3218
Epoch 00006: saving model to ./training/snapshots/resnet101_pascal_06.h5
Epoch 7/150
1/500 [..............................] - ETA: 2:36 - loss: 2.3568 - regression_loss: 1.9707 - classification_loss: 0.3862
[... per-batch progress updates for steps 2-13 elided ...]
14/500 [..............................]
- ETA: 2:38 - loss: 2.2287 - regression_loss: 1.8580 - classification_loss: 0.3707
[... per-batch progress updates for steps 15-253 elided; running loss fluctuates between ~2.15 and ~2.25 ...]
254/500 [==============>...............]
- ETA: 1:21 - loss: 2.2391 - regression_loss: 1.8656 - classification_loss: 0.3735 255/500 [==============>...............] - ETA: 1:20 - loss: 2.2411 - regression_loss: 1.8672 - classification_loss: 0.3739 256/500 [==============>...............] - ETA: 1:20 - loss: 2.2427 - regression_loss: 1.8684 - classification_loss: 0.3743 257/500 [==============>...............] - ETA: 1:20 - loss: 2.2416 - regression_loss: 1.8674 - classification_loss: 0.3742 258/500 [==============>...............] - ETA: 1:19 - loss: 2.2408 - regression_loss: 1.8667 - classification_loss: 0.3741 259/500 [==============>...............] - ETA: 1:19 - loss: 2.2418 - regression_loss: 1.8673 - classification_loss: 0.3746 260/500 [==============>...............] - ETA: 1:19 - loss: 2.2415 - regression_loss: 1.8671 - classification_loss: 0.3743 261/500 [==============>...............] - ETA: 1:18 - loss: 2.2426 - regression_loss: 1.8677 - classification_loss: 0.3748 262/500 [==============>...............] - ETA: 1:18 - loss: 2.2430 - regression_loss: 1.8682 - classification_loss: 0.3748 263/500 [==============>...............] - ETA: 1:18 - loss: 2.2432 - regression_loss: 1.8684 - classification_loss: 0.3748 264/500 [==============>...............] - ETA: 1:17 - loss: 2.2408 - regression_loss: 1.8661 - classification_loss: 0.3746 265/500 [==============>...............] - ETA: 1:17 - loss: 2.2409 - regression_loss: 1.8664 - classification_loss: 0.3745 266/500 [==============>...............] - ETA: 1:17 - loss: 2.2418 - regression_loss: 1.8674 - classification_loss: 0.3744 267/500 [===============>..............] - ETA: 1:16 - loss: 2.2413 - regression_loss: 1.8670 - classification_loss: 0.3743 268/500 [===============>..............] - ETA: 1:16 - loss: 2.2423 - regression_loss: 1.8679 - classification_loss: 0.3743 269/500 [===============>..............] - ETA: 1:16 - loss: 2.2397 - regression_loss: 1.8655 - classification_loss: 0.3743 270/500 [===============>..............] 
- ETA: 1:15 - loss: 2.2406 - regression_loss: 1.8663 - classification_loss: 0.3743 271/500 [===============>..............] - ETA: 1:15 - loss: 2.2381 - regression_loss: 1.8641 - classification_loss: 0.3741 272/500 [===============>..............] - ETA: 1:15 - loss: 2.2366 - regression_loss: 1.8621 - classification_loss: 0.3745 273/500 [===============>..............] - ETA: 1:14 - loss: 2.2368 - regression_loss: 1.8624 - classification_loss: 0.3744 274/500 [===============>..............] - ETA: 1:14 - loss: 2.2365 - regression_loss: 1.8621 - classification_loss: 0.3744 275/500 [===============>..............] - ETA: 1:14 - loss: 2.2371 - regression_loss: 1.8628 - classification_loss: 0.3743 276/500 [===============>..............] - ETA: 1:13 - loss: 2.2363 - regression_loss: 1.8623 - classification_loss: 0.3741 277/500 [===============>..............] - ETA: 1:13 - loss: 2.2387 - regression_loss: 1.8642 - classification_loss: 0.3745 278/500 [===============>..............] - ETA: 1:13 - loss: 2.2414 - regression_loss: 1.8665 - classification_loss: 0.3749 279/500 [===============>..............] - ETA: 1:13 - loss: 2.2396 - regression_loss: 1.8650 - classification_loss: 0.3745 280/500 [===============>..............] - ETA: 1:12 - loss: 2.2402 - regression_loss: 1.8655 - classification_loss: 0.3747 281/500 [===============>..............] - ETA: 1:12 - loss: 2.2369 - regression_loss: 1.8627 - classification_loss: 0.3742 282/500 [===============>..............] - ETA: 1:12 - loss: 2.2348 - regression_loss: 1.8611 - classification_loss: 0.3737 283/500 [===============>..............] - ETA: 1:11 - loss: 2.2351 - regression_loss: 1.8614 - classification_loss: 0.3737 284/500 [================>.............] - ETA: 1:11 - loss: 2.2323 - regression_loss: 1.8589 - classification_loss: 0.3734 285/500 [================>.............] - ETA: 1:11 - loss: 2.2338 - regression_loss: 1.8601 - classification_loss: 0.3737 286/500 [================>.............] 
- ETA: 1:10 - loss: 2.2342 - regression_loss: 1.8606 - classification_loss: 0.3736 287/500 [================>.............] - ETA: 1:10 - loss: 2.2348 - regression_loss: 1.8612 - classification_loss: 0.3737 288/500 [================>.............] - ETA: 1:10 - loss: 2.2343 - regression_loss: 1.8607 - classification_loss: 0.3736 289/500 [================>.............] - ETA: 1:09 - loss: 2.2371 - regression_loss: 1.8632 - classification_loss: 0.3739 290/500 [================>.............] - ETA: 1:09 - loss: 2.2344 - regression_loss: 1.8610 - classification_loss: 0.3734 291/500 [================>.............] - ETA: 1:09 - loss: 2.2351 - regression_loss: 1.8616 - classification_loss: 0.3735 292/500 [================>.............] - ETA: 1:08 - loss: 2.2352 - regression_loss: 1.8617 - classification_loss: 0.3735 293/500 [================>.............] - ETA: 1:08 - loss: 2.2355 - regression_loss: 1.8620 - classification_loss: 0.3735 294/500 [================>.............] - ETA: 1:08 - loss: 2.2370 - regression_loss: 1.8633 - classification_loss: 0.3738 295/500 [================>.............] - ETA: 1:07 - loss: 2.2366 - regression_loss: 1.8629 - classification_loss: 0.3737 296/500 [================>.............] - ETA: 1:07 - loss: 2.2363 - regression_loss: 1.8628 - classification_loss: 0.3735 297/500 [================>.............] - ETA: 1:07 - loss: 2.2357 - regression_loss: 1.8622 - classification_loss: 0.3735 298/500 [================>.............] - ETA: 1:06 - loss: 2.2361 - regression_loss: 1.8618 - classification_loss: 0.3743 299/500 [================>.............] - ETA: 1:06 - loss: 2.2354 - regression_loss: 1.8608 - classification_loss: 0.3746 300/500 [=================>............] - ETA: 1:06 - loss: 2.2354 - regression_loss: 1.8609 - classification_loss: 0.3745 301/500 [=================>............] - ETA: 1:05 - loss: 2.2353 - regression_loss: 1.8609 - classification_loss: 0.3744 302/500 [=================>............] 
- ETA: 1:05 - loss: 2.2353 - regression_loss: 1.8609 - classification_loss: 0.3745 303/500 [=================>............] - ETA: 1:05 - loss: 2.2342 - regression_loss: 1.8600 - classification_loss: 0.3742 304/500 [=================>............] - ETA: 1:04 - loss: 2.2349 - regression_loss: 1.8604 - classification_loss: 0.3745 305/500 [=================>............] - ETA: 1:04 - loss: 2.2333 - regression_loss: 1.8588 - classification_loss: 0.3745 306/500 [=================>............] - ETA: 1:04 - loss: 2.2310 - regression_loss: 1.8569 - classification_loss: 0.3741 307/500 [=================>............] - ETA: 1:03 - loss: 2.2312 - regression_loss: 1.8571 - classification_loss: 0.3741 308/500 [=================>............] - ETA: 1:03 - loss: 2.2309 - regression_loss: 1.8570 - classification_loss: 0.3739 309/500 [=================>............] - ETA: 1:03 - loss: 2.2304 - regression_loss: 1.8566 - classification_loss: 0.3738 310/500 [=================>............] - ETA: 1:02 - loss: 2.2308 - regression_loss: 1.8570 - classification_loss: 0.3739 311/500 [=================>............] - ETA: 1:02 - loss: 2.2299 - regression_loss: 1.8563 - classification_loss: 0.3736 312/500 [=================>............] - ETA: 1:02 - loss: 2.2303 - regression_loss: 1.8566 - classification_loss: 0.3736 313/500 [=================>............] - ETA: 1:01 - loss: 2.2311 - regression_loss: 1.8574 - classification_loss: 0.3737 314/500 [=================>............] - ETA: 1:01 - loss: 2.2275 - regression_loss: 1.8542 - classification_loss: 0.3734 315/500 [=================>............] - ETA: 1:01 - loss: 2.2238 - regression_loss: 1.8510 - classification_loss: 0.3728 316/500 [=================>............] - ETA: 1:00 - loss: 2.2229 - regression_loss: 1.8504 - classification_loss: 0.3725 317/500 [==================>...........] - ETA: 1:00 - loss: 2.2236 - regression_loss: 1.8511 - classification_loss: 0.3725 318/500 [==================>...........] 
- ETA: 1:00 - loss: 2.2225 - regression_loss: 1.8502 - classification_loss: 0.3723 319/500 [==================>...........] - ETA: 59s - loss: 2.2225 - regression_loss: 1.8503 - classification_loss: 0.3722  320/500 [==================>...........] - ETA: 59s - loss: 2.2224 - regression_loss: 1.8503 - classification_loss: 0.3722 321/500 [==================>...........] - ETA: 59s - loss: 2.2197 - regression_loss: 1.8481 - classification_loss: 0.3716 322/500 [==================>...........] - ETA: 58s - loss: 2.2198 - regression_loss: 1.8482 - classification_loss: 0.3716 323/500 [==================>...........] - ETA: 58s - loss: 2.2202 - regression_loss: 1.8486 - classification_loss: 0.3716 324/500 [==================>...........] - ETA: 58s - loss: 2.2200 - regression_loss: 1.8483 - classification_loss: 0.3716 325/500 [==================>...........] - ETA: 57s - loss: 2.2177 - regression_loss: 1.8464 - classification_loss: 0.3713 326/500 [==================>...........] - ETA: 57s - loss: 2.2182 - regression_loss: 1.8470 - classification_loss: 0.3712 327/500 [==================>...........] - ETA: 57s - loss: 2.2142 - regression_loss: 1.8435 - classification_loss: 0.3707 328/500 [==================>...........] - ETA: 56s - loss: 2.2159 - regression_loss: 1.8451 - classification_loss: 0.3708 329/500 [==================>...........] - ETA: 56s - loss: 2.2143 - regression_loss: 1.8438 - classification_loss: 0.3705 330/500 [==================>...........] - ETA: 56s - loss: 2.2145 - regression_loss: 1.8441 - classification_loss: 0.3704 331/500 [==================>...........] - ETA: 55s - loss: 2.2147 - regression_loss: 1.8445 - classification_loss: 0.3702 332/500 [==================>...........] - ETA: 55s - loss: 2.2152 - regression_loss: 1.8451 - classification_loss: 0.3701 333/500 [==================>...........] - ETA: 55s - loss: 2.2135 - regression_loss: 1.8436 - classification_loss: 0.3699 334/500 [===================>..........] 
- ETA: 54s - loss: 2.2143 - regression_loss: 1.8443 - classification_loss: 0.3700 335/500 [===================>..........] - ETA: 54s - loss: 2.2150 - regression_loss: 1.8450 - classification_loss: 0.3701 336/500 [===================>..........] - ETA: 54s - loss: 2.2154 - regression_loss: 1.8453 - classification_loss: 0.3701 337/500 [===================>..........] - ETA: 53s - loss: 2.2162 - regression_loss: 1.8461 - classification_loss: 0.3701 338/500 [===================>..........] - ETA: 53s - loss: 2.2143 - regression_loss: 1.8444 - classification_loss: 0.3699 339/500 [===================>..........] - ETA: 53s - loss: 2.2144 - regression_loss: 1.8446 - classification_loss: 0.3699 340/500 [===================>..........] - ETA: 52s - loss: 2.2144 - regression_loss: 1.8446 - classification_loss: 0.3698 341/500 [===================>..........] - ETA: 52s - loss: 2.2151 - regression_loss: 1.8454 - classification_loss: 0.3697 342/500 [===================>..........] - ETA: 52s - loss: 2.2136 - regression_loss: 1.8443 - classification_loss: 0.3693 343/500 [===================>..........] - ETA: 51s - loss: 2.2116 - regression_loss: 1.8425 - classification_loss: 0.3691 344/500 [===================>..........] - ETA: 51s - loss: 2.2113 - regression_loss: 1.8422 - classification_loss: 0.3690 345/500 [===================>..........] - ETA: 51s - loss: 2.2105 - regression_loss: 1.8417 - classification_loss: 0.3688 346/500 [===================>..........] - ETA: 50s - loss: 2.2109 - regression_loss: 1.8419 - classification_loss: 0.3689 347/500 [===================>..........] - ETA: 50s - loss: 2.2112 - regression_loss: 1.8424 - classification_loss: 0.3688 348/500 [===================>..........] - ETA: 50s - loss: 2.2113 - regression_loss: 1.8425 - classification_loss: 0.3688 349/500 [===================>..........] - ETA: 49s - loss: 2.2095 - regression_loss: 1.8411 - classification_loss: 0.3684 350/500 [====================>.........] 
- ETA: 49s - loss: 2.2101 - regression_loss: 1.8414 - classification_loss: 0.3687 351/500 [====================>.........] - ETA: 49s - loss: 2.2119 - regression_loss: 1.8430 - classification_loss: 0.3689 352/500 [====================>.........] - ETA: 48s - loss: 2.2129 - regression_loss: 1.8438 - classification_loss: 0.3690 353/500 [====================>.........] - ETA: 48s - loss: 2.2138 - regression_loss: 1.8446 - classification_loss: 0.3691 354/500 [====================>.........] - ETA: 48s - loss: 2.2133 - regression_loss: 1.8443 - classification_loss: 0.3690 355/500 [====================>.........] - ETA: 47s - loss: 2.2131 - regression_loss: 1.8442 - classification_loss: 0.3689 356/500 [====================>.........] - ETA: 47s - loss: 2.2137 - regression_loss: 1.8449 - classification_loss: 0.3688 357/500 [====================>.........] - ETA: 47s - loss: 2.2135 - regression_loss: 1.8446 - classification_loss: 0.3689 358/500 [====================>.........] - ETA: 46s - loss: 2.2125 - regression_loss: 1.8440 - classification_loss: 0.3685 359/500 [====================>.........] - ETA: 46s - loss: 2.2131 - regression_loss: 1.8444 - classification_loss: 0.3687 360/500 [====================>.........] - ETA: 46s - loss: 2.2131 - regression_loss: 1.8445 - classification_loss: 0.3685 361/500 [====================>.........] - ETA: 45s - loss: 2.2125 - regression_loss: 1.8441 - classification_loss: 0.3685 362/500 [====================>.........] - ETA: 45s - loss: 2.2119 - regression_loss: 1.8437 - classification_loss: 0.3682 363/500 [====================>.........] - ETA: 45s - loss: 2.2126 - regression_loss: 1.8443 - classification_loss: 0.3683 364/500 [====================>.........] - ETA: 44s - loss: 2.2109 - regression_loss: 1.8431 - classification_loss: 0.3678 365/500 [====================>.........] - ETA: 44s - loss: 2.2092 - regression_loss: 1.8416 - classification_loss: 0.3676 366/500 [====================>.........] 
- ETA: 44s - loss: 2.2092 - regression_loss: 1.8417 - classification_loss: 0.3675 367/500 [=====================>........] - ETA: 43s - loss: 2.2075 - regression_loss: 1.8403 - classification_loss: 0.3672 368/500 [=====================>........] - ETA: 43s - loss: 2.2077 - regression_loss: 1.8406 - classification_loss: 0.3672 369/500 [=====================>........] - ETA: 43s - loss: 2.2077 - regression_loss: 1.8406 - classification_loss: 0.3671 370/500 [=====================>........] - ETA: 43s - loss: 2.2079 - regression_loss: 1.8408 - classification_loss: 0.3671 371/500 [=====================>........] - ETA: 42s - loss: 2.2089 - regression_loss: 1.8415 - classification_loss: 0.3674 372/500 [=====================>........] - ETA: 42s - loss: 2.2092 - regression_loss: 1.8407 - classification_loss: 0.3685 373/500 [=====================>........] - ETA: 42s - loss: 2.2093 - regression_loss: 1.8407 - classification_loss: 0.3685 374/500 [=====================>........] - ETA: 41s - loss: 2.2095 - regression_loss: 1.8410 - classification_loss: 0.3685 375/500 [=====================>........] - ETA: 41s - loss: 2.2075 - regression_loss: 1.8395 - classification_loss: 0.3681 376/500 [=====================>........] - ETA: 41s - loss: 2.2077 - regression_loss: 1.8397 - classification_loss: 0.3680 377/500 [=====================>........] - ETA: 40s - loss: 2.2074 - regression_loss: 1.8396 - classification_loss: 0.3679 378/500 [=====================>........] - ETA: 40s - loss: 2.2080 - regression_loss: 1.8402 - classification_loss: 0.3679 379/500 [=====================>........] - ETA: 40s - loss: 2.2079 - regression_loss: 1.8400 - classification_loss: 0.3679 380/500 [=====================>........] - ETA: 39s - loss: 2.2088 - regression_loss: 1.8407 - classification_loss: 0.3681 381/500 [=====================>........] - ETA: 39s - loss: 2.2084 - regression_loss: 1.8405 - classification_loss: 0.3678 382/500 [=====================>........] 
- ETA: 39s - loss: 2.2082 - regression_loss: 1.8406 - classification_loss: 0.3677 383/500 [=====================>........] - ETA: 38s - loss: 2.2070 - regression_loss: 1.8395 - classification_loss: 0.3675 384/500 [======================>.......] - ETA: 38s - loss: 2.2071 - regression_loss: 1.8396 - classification_loss: 0.3674 385/500 [======================>.......] - ETA: 38s - loss: 2.2077 - regression_loss: 1.8403 - classification_loss: 0.3674 386/500 [======================>.......] - ETA: 37s - loss: 2.2069 - regression_loss: 1.8396 - classification_loss: 0.3672 387/500 [======================>.......] - ETA: 37s - loss: 2.2053 - regression_loss: 1.8384 - classification_loss: 0.3669 388/500 [======================>.......] - ETA: 37s - loss: 2.2049 - regression_loss: 1.8380 - classification_loss: 0.3668 389/500 [======================>.......] - ETA: 36s - loss: 2.2028 - regression_loss: 1.8362 - classification_loss: 0.3665 390/500 [======================>.......] - ETA: 36s - loss: 2.2018 - regression_loss: 1.8354 - classification_loss: 0.3664 391/500 [======================>.......] - ETA: 36s - loss: 2.2016 - regression_loss: 1.8353 - classification_loss: 0.3663 392/500 [======================>.......] - ETA: 35s - loss: 2.2017 - regression_loss: 1.8354 - classification_loss: 0.3663 393/500 [======================>.......] - ETA: 35s - loss: 2.2004 - regression_loss: 1.8343 - classification_loss: 0.3661 394/500 [======================>.......] - ETA: 35s - loss: 2.2001 - regression_loss: 1.8342 - classification_loss: 0.3659 395/500 [======================>.......] - ETA: 34s - loss: 2.2003 - regression_loss: 1.8344 - classification_loss: 0.3660 396/500 [======================>.......] - ETA: 34s - loss: 2.2007 - regression_loss: 1.8348 - classification_loss: 0.3659 397/500 [======================>.......] - ETA: 34s - loss: 2.2011 - regression_loss: 1.8352 - classification_loss: 0.3659 398/500 [======================>.......] 
- ETA: 33s - loss: 2.2016 - regression_loss: 1.8356 - classification_loss: 0.3660 399/500 [======================>.......] - ETA: 33s - loss: 2.2023 - regression_loss: 1.8363 - classification_loss: 0.3660 400/500 [=======================>......] - ETA: 33s - loss: 2.2026 - regression_loss: 1.8366 - classification_loss: 0.3660 401/500 [=======================>......] - ETA: 32s - loss: 2.2030 - regression_loss: 1.8370 - classification_loss: 0.3660 402/500 [=======================>......] - ETA: 32s - loss: 2.2004 - regression_loss: 1.8347 - classification_loss: 0.3657 403/500 [=======================>......] - ETA: 32s - loss: 2.1986 - regression_loss: 1.8333 - classification_loss: 0.3653 404/500 [=======================>......] - ETA: 31s - loss: 2.2000 - regression_loss: 1.8346 - classification_loss: 0.3654 405/500 [=======================>......] - ETA: 31s - loss: 2.1999 - regression_loss: 1.8346 - classification_loss: 0.3653 406/500 [=======================>......] - ETA: 31s - loss: 2.1999 - regression_loss: 1.8346 - classification_loss: 0.3653 407/500 [=======================>......] - ETA: 30s - loss: 2.1996 - regression_loss: 1.8342 - classification_loss: 0.3654 408/500 [=======================>......] - ETA: 30s - loss: 2.1995 - regression_loss: 1.8341 - classification_loss: 0.3653 409/500 [=======================>......] - ETA: 30s - loss: 2.1998 - regression_loss: 1.8346 - classification_loss: 0.3652 410/500 [=======================>......] - ETA: 29s - loss: 2.1999 - regression_loss: 1.8350 - classification_loss: 0.3650 411/500 [=======================>......] - ETA: 29s - loss: 2.1982 - regression_loss: 1.8336 - classification_loss: 0.3646 412/500 [=======================>......] - ETA: 29s - loss: 2.1982 - regression_loss: 1.8336 - classification_loss: 0.3646 413/500 [=======================>......] - ETA: 28s - loss: 2.1966 - regression_loss: 1.8321 - classification_loss: 0.3645 414/500 [=======================>......] 
- ETA: 28s - loss: 2.1966 - regression_loss: 1.8321 - classification_loss: 0.3645 415/500 [=======================>......] - ETA: 28s - loss: 2.1966 - regression_loss: 1.8323 - classification_loss: 0.3643 416/500 [=======================>......] - ETA: 27s - loss: 2.1963 - regression_loss: 1.8320 - classification_loss: 0.3643 417/500 [========================>.....] - ETA: 27s - loss: 2.1970 - regression_loss: 1.8327 - classification_loss: 0.3643 418/500 [========================>.....] - ETA: 27s - loss: 2.1974 - regression_loss: 1.8331 - classification_loss: 0.3643 419/500 [========================>.....] - ETA: 26s - loss: 2.1968 - regression_loss: 1.8326 - classification_loss: 0.3642 420/500 [========================>.....] - ETA: 26s - loss: 2.1974 - regression_loss: 1.8330 - classification_loss: 0.3644 421/500 [========================>.....] - ETA: 26s - loss: 2.1980 - regression_loss: 1.8335 - classification_loss: 0.3645 422/500 [========================>.....] - ETA: 25s - loss: 2.1983 - regression_loss: 1.8337 - classification_loss: 0.3646 423/500 [========================>.....] - ETA: 25s - loss: 2.1981 - regression_loss: 1.8335 - classification_loss: 0.3645 424/500 [========================>.....] - ETA: 25s - loss: 2.1976 - regression_loss: 1.8333 - classification_loss: 0.3643 425/500 [========================>.....] - ETA: 24s - loss: 2.1955 - regression_loss: 1.8313 - classification_loss: 0.3641 426/500 [========================>.....] - ETA: 24s - loss: 2.1957 - regression_loss: 1.8315 - classification_loss: 0.3642 427/500 [========================>.....] - ETA: 24s - loss: 2.1950 - regression_loss: 1.8309 - classification_loss: 0.3641 428/500 [========================>.....] - ETA: 23s - loss: 2.1933 - regression_loss: 1.8295 - classification_loss: 0.3638 429/500 [========================>.....] - ETA: 23s - loss: 2.1941 - regression_loss: 1.8303 - classification_loss: 0.3638 430/500 [========================>.....] 
- ETA: 23s - loss: 2.1952 - regression_loss: 1.8311 - classification_loss: 0.3641 431/500 [========================>.....] - ETA: 22s - loss: 2.1943 - regression_loss: 1.8302 - classification_loss: 0.3641 432/500 [========================>.....] - ETA: 22s - loss: 2.1933 - regression_loss: 1.8294 - classification_loss: 0.3640 433/500 [========================>.....] - ETA: 22s - loss: 2.1934 - regression_loss: 1.8295 - classification_loss: 0.3639 434/500 [=========================>....] - ETA: 21s - loss: 2.1937 - regression_loss: 1.8297 - classification_loss: 0.3640 435/500 [=========================>....] - ETA: 21s - loss: 2.1953 - regression_loss: 1.8306 - classification_loss: 0.3647 436/500 [=========================>....] - ETA: 21s - loss: 2.1959 - regression_loss: 1.8311 - classification_loss: 0.3648 437/500 [=========================>....] - ETA: 20s - loss: 2.1973 - regression_loss: 1.8322 - classification_loss: 0.3651 438/500 [=========================>....] - ETA: 20s - loss: 2.1977 - regression_loss: 1.8326 - classification_loss: 0.3651 439/500 [=========================>....] - ETA: 20s - loss: 2.1958 - regression_loss: 1.8311 - classification_loss: 0.3648 440/500 [=========================>....] - ETA: 19s - loss: 2.1941 - regression_loss: 1.8296 - classification_loss: 0.3645 441/500 [=========================>....] - ETA: 19s - loss: 2.1947 - regression_loss: 1.8302 - classification_loss: 0.3645 442/500 [=========================>....] - ETA: 19s - loss: 2.1954 - regression_loss: 1.8308 - classification_loss: 0.3646 443/500 [=========================>....] - ETA: 18s - loss: 2.1967 - regression_loss: 1.8319 - classification_loss: 0.3648 444/500 [=========================>....] - ETA: 18s - loss: 2.1968 - regression_loss: 1.8320 - classification_loss: 0.3648 445/500 [=========================>....] - ETA: 18s - loss: 2.1957 - regression_loss: 1.8309 - classification_loss: 0.3648 446/500 [=========================>....] 
- ETA: 17s - loss: 2.1961 - regression_loss: 1.8313 - classification_loss: 0.3649 447/500 [=========================>....] - ETA: 17s - loss: 2.1965 - regression_loss: 1.8316 - classification_loss: 0.3649 448/500 [=========================>....] - ETA: 17s - loss: 2.1960 - regression_loss: 1.8313 - classification_loss: 0.3647 449/500 [=========================>....] - ETA: 16s - loss: 2.1978 - regression_loss: 1.8329 - classification_loss: 0.3649 450/500 [==========================>...] - ETA: 16s - loss: 2.1976 - regression_loss: 1.8328 - classification_loss: 0.3648 451/500 [==========================>...] - ETA: 16s - loss: 2.1982 - regression_loss: 1.8333 - classification_loss: 0.3649 452/500 [==========================>...] - ETA: 15s - loss: 2.1992 - regression_loss: 1.8343 - classification_loss: 0.3649 453/500 [==========================>...] - ETA: 15s - loss: 2.1991 - regression_loss: 1.8343 - classification_loss: 0.3648 454/500 [==========================>...] - ETA: 15s - loss: 2.1995 - regression_loss: 1.8346 - classification_loss: 0.3649 455/500 [==========================>...] - ETA: 14s - loss: 2.1982 - regression_loss: 1.8336 - classification_loss: 0.3646 456/500 [==========================>...] - ETA: 14s - loss: 2.1967 - regression_loss: 1.8321 - classification_loss: 0.3646 457/500 [==========================>...] - ETA: 14s - loss: 2.1949 - regression_loss: 1.8306 - classification_loss: 0.3643 458/500 [==========================>...] - ETA: 13s - loss: 2.1953 - regression_loss: 1.8308 - classification_loss: 0.3645 459/500 [==========================>...] - ETA: 13s - loss: 2.1954 - regression_loss: 1.8310 - classification_loss: 0.3644 460/500 [==========================>...] - ETA: 13s - loss: 2.1951 - regression_loss: 1.8308 - classification_loss: 0.3643 461/500 [==========================>...] - ETA: 12s - loss: 2.1953 - regression_loss: 1.8308 - classification_loss: 0.3644 462/500 [==========================>...] 
[per-step progress updates for epoch 7, steps 463–499 elided; running loss ~2.19, regression_loss ~1.82, classification_loss ~0.36]
500/500 [==============================] - 166s 332ms/step - loss: 2.1844 - regression_loss: 1.8223 - classification_loss: 0.3622
1172 instances of class plum with average precision: 0.3768
mAP: 0.3768
Epoch 00007: saving model to ./training/snapshots/resnet101_pascal_07.h5
Epoch 8/150
[per-step progress updates for epoch 8, steps 1–9 elided; running loss ~2.21–2.25]
[per-step progress updates for epoch 8, steps 10–297 elided; running loss declines from ~2.27 (regression_loss ~1.91, classification_loss ~0.36) at step 10 to ~2.10 (regression_loss ~1.76, classification_loss ~0.34) by step 297; log truncated mid-epoch]
- ETA: 1:07 - loss: 2.0986 - regression_loss: 1.7574 - classification_loss: 0.3412 298/500 [================>.............] - ETA: 1:06 - loss: 2.0985 - regression_loss: 1.7575 - classification_loss: 0.3410 299/500 [================>.............] - ETA: 1:06 - loss: 2.0986 - regression_loss: 1.7576 - classification_loss: 0.3410 300/500 [=================>............] - ETA: 1:06 - loss: 2.0951 - regression_loss: 1.7544 - classification_loss: 0.3407 301/500 [=================>............] - ETA: 1:05 - loss: 2.0979 - regression_loss: 1.7567 - classification_loss: 0.3412 302/500 [=================>............] - ETA: 1:05 - loss: 2.0982 - regression_loss: 1.7570 - classification_loss: 0.3412 303/500 [=================>............] - ETA: 1:05 - loss: 2.0966 - regression_loss: 1.7557 - classification_loss: 0.3408 304/500 [=================>............] - ETA: 1:04 - loss: 2.0969 - regression_loss: 1.7561 - classification_loss: 0.3408 305/500 [=================>............] - ETA: 1:04 - loss: 2.0963 - regression_loss: 1.7557 - classification_loss: 0.3406 306/500 [=================>............] - ETA: 1:04 - loss: 2.0968 - regression_loss: 1.7561 - classification_loss: 0.3408 307/500 [=================>............] - ETA: 1:03 - loss: 2.0980 - regression_loss: 1.7572 - classification_loss: 0.3408 308/500 [=================>............] - ETA: 1:03 - loss: 2.0968 - regression_loss: 1.7563 - classification_loss: 0.3406 309/500 [=================>............] - ETA: 1:03 - loss: 2.0961 - regression_loss: 1.7559 - classification_loss: 0.3402 310/500 [=================>............] - ETA: 1:02 - loss: 2.0945 - regression_loss: 1.7542 - classification_loss: 0.3402 311/500 [=================>............] - ETA: 1:02 - loss: 2.0943 - regression_loss: 1.7542 - classification_loss: 0.3401 312/500 [=================>............] - ETA: 1:02 - loss: 2.0940 - regression_loss: 1.7539 - classification_loss: 0.3401 313/500 [=================>............] 
- ETA: 1:01 - loss: 2.0942 - regression_loss: 1.7541 - classification_loss: 0.3401 314/500 [=================>............] - ETA: 1:01 - loss: 2.0957 - regression_loss: 1.7554 - classification_loss: 0.3403 315/500 [=================>............] - ETA: 1:01 - loss: 2.0960 - regression_loss: 1.7558 - classification_loss: 0.3402 316/500 [=================>............] - ETA: 1:00 - loss: 2.0968 - regression_loss: 1.7565 - classification_loss: 0.3402 317/500 [==================>...........] - ETA: 1:00 - loss: 2.0978 - regression_loss: 1.7575 - classification_loss: 0.3403 318/500 [==================>...........] - ETA: 1:00 - loss: 2.0968 - regression_loss: 1.7567 - classification_loss: 0.3401 319/500 [==================>...........] - ETA: 59s - loss: 2.0979 - regression_loss: 1.7578 - classification_loss: 0.3402  320/500 [==================>...........] - ETA: 59s - loss: 2.0964 - regression_loss: 1.7566 - classification_loss: 0.3399 321/500 [==================>...........] - ETA: 59s - loss: 2.0971 - regression_loss: 1.7574 - classification_loss: 0.3397 322/500 [==================>...........] - ETA: 58s - loss: 2.0980 - regression_loss: 1.7582 - classification_loss: 0.3398 323/500 [==================>...........] - ETA: 58s - loss: 2.0936 - regression_loss: 1.7543 - classification_loss: 0.3393 324/500 [==================>...........] - ETA: 58s - loss: 2.0932 - regression_loss: 1.7540 - classification_loss: 0.3393 325/500 [==================>...........] - ETA: 57s - loss: 2.0937 - regression_loss: 1.7544 - classification_loss: 0.3393 326/500 [==================>...........] - ETA: 57s - loss: 2.0913 - regression_loss: 1.7524 - classification_loss: 0.3389 327/500 [==================>...........] - ETA: 57s - loss: 2.0941 - regression_loss: 1.7547 - classification_loss: 0.3393 328/500 [==================>...........] - ETA: 56s - loss: 2.0927 - regression_loss: 1.7536 - classification_loss: 0.3391 329/500 [==================>...........] 
- ETA: 56s - loss: 2.0920 - regression_loss: 1.7531 - classification_loss: 0.3389 330/500 [==================>...........] - ETA: 56s - loss: 2.0927 - regression_loss: 1.7537 - classification_loss: 0.3391 331/500 [==================>...........] - ETA: 55s - loss: 2.0932 - regression_loss: 1.7541 - classification_loss: 0.3391 332/500 [==================>...........] - ETA: 55s - loss: 2.0935 - regression_loss: 1.7545 - classification_loss: 0.3390 333/500 [==================>...........] - ETA: 55s - loss: 2.0934 - regression_loss: 1.7546 - classification_loss: 0.3389 334/500 [===================>..........] - ETA: 54s - loss: 2.0927 - regression_loss: 1.7542 - classification_loss: 0.3386 335/500 [===================>..........] - ETA: 54s - loss: 2.0931 - regression_loss: 1.7547 - classification_loss: 0.3384 336/500 [===================>..........] - ETA: 54s - loss: 2.0932 - regression_loss: 1.7547 - classification_loss: 0.3385 337/500 [===================>..........] - ETA: 53s - loss: 2.0941 - regression_loss: 1.7556 - classification_loss: 0.3385 338/500 [===================>..........] - ETA: 53s - loss: 2.0946 - regression_loss: 1.7561 - classification_loss: 0.3385 339/500 [===================>..........] - ETA: 53s - loss: 2.0934 - regression_loss: 1.7553 - classification_loss: 0.3381 340/500 [===================>..........] - ETA: 52s - loss: 2.0940 - regression_loss: 1.7558 - classification_loss: 0.3382 341/500 [===================>..........] - ETA: 52s - loss: 2.0919 - regression_loss: 1.7542 - classification_loss: 0.3377 342/500 [===================>..........] - ETA: 52s - loss: 2.0918 - regression_loss: 1.7541 - classification_loss: 0.3377 343/500 [===================>..........] - ETA: 51s - loss: 2.0894 - regression_loss: 1.7521 - classification_loss: 0.3373 344/500 [===================>..........] - ETA: 51s - loss: 2.0896 - regression_loss: 1.7525 - classification_loss: 0.3372 345/500 [===================>..........] 
- ETA: 51s - loss: 2.0893 - regression_loss: 1.7522 - classification_loss: 0.3371 346/500 [===================>..........] - ETA: 50s - loss: 2.0867 - regression_loss: 1.7499 - classification_loss: 0.3368 347/500 [===================>..........] - ETA: 50s - loss: 2.0876 - regression_loss: 1.7507 - classification_loss: 0.3369 348/500 [===================>..........] - ETA: 50s - loss: 2.0859 - regression_loss: 1.7493 - classification_loss: 0.3366 349/500 [===================>..........] - ETA: 49s - loss: 2.0864 - regression_loss: 1.7500 - classification_loss: 0.3364 350/500 [====================>.........] - ETA: 49s - loss: 2.0868 - regression_loss: 1.7504 - classification_loss: 0.3364 351/500 [====================>.........] - ETA: 49s - loss: 2.0874 - regression_loss: 1.7509 - classification_loss: 0.3365 352/500 [====================>.........] - ETA: 48s - loss: 2.0865 - regression_loss: 1.7504 - classification_loss: 0.3362 353/500 [====================>.........] - ETA: 48s - loss: 2.0871 - regression_loss: 1.7509 - classification_loss: 0.3362 354/500 [====================>.........] - ETA: 48s - loss: 2.0879 - regression_loss: 1.7515 - classification_loss: 0.3364 355/500 [====================>.........] - ETA: 47s - loss: 2.0873 - regression_loss: 1.7512 - classification_loss: 0.3361 356/500 [====================>.........] - ETA: 47s - loss: 2.0884 - regression_loss: 1.7520 - classification_loss: 0.3363 357/500 [====================>.........] - ETA: 47s - loss: 2.0892 - regression_loss: 1.7527 - classification_loss: 0.3365 358/500 [====================>.........] - ETA: 46s - loss: 2.0894 - regression_loss: 1.7530 - classification_loss: 0.3363 359/500 [====================>.........] - ETA: 46s - loss: 2.0885 - regression_loss: 1.7525 - classification_loss: 0.3361 360/500 [====================>.........] - ETA: 46s - loss: 2.0882 - regression_loss: 1.7522 - classification_loss: 0.3360 361/500 [====================>.........] 
- ETA: 45s - loss: 2.0904 - regression_loss: 1.7539 - classification_loss: 0.3364 362/500 [====================>.........] - ETA: 45s - loss: 2.0885 - regression_loss: 1.7525 - classification_loss: 0.3360 363/500 [====================>.........] - ETA: 45s - loss: 2.0890 - regression_loss: 1.7528 - classification_loss: 0.3361 364/500 [====================>.........] - ETA: 44s - loss: 2.0901 - regression_loss: 1.7540 - classification_loss: 0.3361 365/500 [====================>.........] - ETA: 44s - loss: 2.0899 - regression_loss: 1.7541 - classification_loss: 0.3359 366/500 [====================>.........] - ETA: 44s - loss: 2.0909 - regression_loss: 1.7547 - classification_loss: 0.3361 367/500 [=====================>........] - ETA: 43s - loss: 2.0896 - regression_loss: 1.7536 - classification_loss: 0.3359 368/500 [=====================>........] - ETA: 43s - loss: 2.0892 - regression_loss: 1.7532 - classification_loss: 0.3360 369/500 [=====================>........] - ETA: 43s - loss: 2.0879 - regression_loss: 1.7523 - classification_loss: 0.3356 370/500 [=====================>........] - ETA: 42s - loss: 2.0884 - regression_loss: 1.7528 - classification_loss: 0.3355 371/500 [=====================>........] - ETA: 42s - loss: 2.0889 - regression_loss: 1.7534 - classification_loss: 0.3356 372/500 [=====================>........] - ETA: 42s - loss: 2.0900 - regression_loss: 1.7544 - classification_loss: 0.3356 373/500 [=====================>........] - ETA: 41s - loss: 2.0892 - regression_loss: 1.7537 - classification_loss: 0.3354 374/500 [=====================>........] - ETA: 41s - loss: 2.0904 - regression_loss: 1.7545 - classification_loss: 0.3358 375/500 [=====================>........] - ETA: 41s - loss: 2.0921 - regression_loss: 1.7560 - classification_loss: 0.3362 376/500 [=====================>........] - ETA: 40s - loss: 2.0904 - regression_loss: 1.7545 - classification_loss: 0.3359 377/500 [=====================>........] 
- ETA: 40s - loss: 2.0906 - regression_loss: 1.7548 - classification_loss: 0.3359 378/500 [=====================>........] - ETA: 40s - loss: 2.0907 - regression_loss: 1.7549 - classification_loss: 0.3358 379/500 [=====================>........] - ETA: 39s - loss: 2.0913 - regression_loss: 1.7553 - classification_loss: 0.3361 380/500 [=====================>........] - ETA: 39s - loss: 2.0909 - regression_loss: 1.7549 - classification_loss: 0.3360 381/500 [=====================>........] - ETA: 39s - loss: 2.0888 - regression_loss: 1.7532 - classification_loss: 0.3356 382/500 [=====================>........] - ETA: 38s - loss: 2.0897 - regression_loss: 1.7538 - classification_loss: 0.3358 383/500 [=====================>........] - ETA: 38s - loss: 2.0898 - regression_loss: 1.7540 - classification_loss: 0.3358 384/500 [======================>.......] - ETA: 38s - loss: 2.0885 - regression_loss: 1.7530 - classification_loss: 0.3355 385/500 [======================>.......] - ETA: 37s - loss: 2.0895 - regression_loss: 1.7538 - classification_loss: 0.3358 386/500 [======================>.......] - ETA: 37s - loss: 2.0898 - regression_loss: 1.7541 - classification_loss: 0.3356 387/500 [======================>.......] - ETA: 37s - loss: 2.0898 - regression_loss: 1.7542 - classification_loss: 0.3356 388/500 [======================>.......] - ETA: 36s - loss: 2.0879 - regression_loss: 1.7526 - classification_loss: 0.3352 389/500 [======================>.......] - ETA: 36s - loss: 2.0872 - regression_loss: 1.7521 - classification_loss: 0.3351 390/500 [======================>.......] - ETA: 36s - loss: 2.0878 - regression_loss: 1.7526 - classification_loss: 0.3352 391/500 [======================>.......] - ETA: 35s - loss: 2.0860 - regression_loss: 1.7511 - classification_loss: 0.3349 392/500 [======================>.......] - ETA: 35s - loss: 2.0863 - regression_loss: 1.7514 - classification_loss: 0.3349 393/500 [======================>.......] 
- ETA: 35s - loss: 2.0850 - regression_loss: 1.7504 - classification_loss: 0.3346 394/500 [======================>.......] - ETA: 34s - loss: 2.0855 - regression_loss: 1.7509 - classification_loss: 0.3347 395/500 [======================>.......] - ETA: 34s - loss: 2.0852 - regression_loss: 1.7505 - classification_loss: 0.3346 396/500 [======================>.......] - ETA: 34s - loss: 2.0863 - regression_loss: 1.7516 - classification_loss: 0.3347 397/500 [======================>.......] - ETA: 34s - loss: 2.0866 - regression_loss: 1.7519 - classification_loss: 0.3348 398/500 [======================>.......] - ETA: 33s - loss: 2.0867 - regression_loss: 1.7519 - classification_loss: 0.3348 399/500 [======================>.......] - ETA: 33s - loss: 2.0870 - regression_loss: 1.7524 - classification_loss: 0.3347 400/500 [=======================>......] - ETA: 33s - loss: 2.0867 - regression_loss: 1.7521 - classification_loss: 0.3346 401/500 [=======================>......] - ETA: 32s - loss: 2.0874 - regression_loss: 1.7528 - classification_loss: 0.3347 402/500 [=======================>......] - ETA: 32s - loss: 2.0868 - regression_loss: 1.7524 - classification_loss: 0.3344 403/500 [=======================>......] - ETA: 32s - loss: 2.0858 - regression_loss: 1.7515 - classification_loss: 0.3343 404/500 [=======================>......] - ETA: 31s - loss: 2.0859 - regression_loss: 1.7516 - classification_loss: 0.3343 405/500 [=======================>......] - ETA: 31s - loss: 2.0860 - regression_loss: 1.7518 - classification_loss: 0.3342 406/500 [=======================>......] - ETA: 31s - loss: 2.0869 - regression_loss: 1.7526 - classification_loss: 0.3343 407/500 [=======================>......] - ETA: 30s - loss: 2.0873 - regression_loss: 1.7530 - classification_loss: 0.3344 408/500 [=======================>......] - ETA: 30s - loss: 2.0883 - regression_loss: 1.7537 - classification_loss: 0.3346 409/500 [=======================>......] 
- ETA: 30s - loss: 2.0879 - regression_loss: 1.7533 - classification_loss: 0.3346 410/500 [=======================>......] - ETA: 29s - loss: 2.0882 - regression_loss: 1.7536 - classification_loss: 0.3346 411/500 [=======================>......] - ETA: 29s - loss: 2.0874 - regression_loss: 1.7529 - classification_loss: 0.3345 412/500 [=======================>......] - ETA: 29s - loss: 2.0881 - regression_loss: 1.7536 - classification_loss: 0.3345 413/500 [=======================>......] - ETA: 28s - loss: 2.0875 - regression_loss: 1.7532 - classification_loss: 0.3343 414/500 [=======================>......] - ETA: 28s - loss: 2.0853 - regression_loss: 1.7514 - classification_loss: 0.3339 415/500 [=======================>......] - ETA: 28s - loss: 2.0862 - regression_loss: 1.7522 - classification_loss: 0.3340 416/500 [=======================>......] - ETA: 27s - loss: 2.0860 - regression_loss: 1.7521 - classification_loss: 0.3339 417/500 [========================>.....] - ETA: 27s - loss: 2.0862 - regression_loss: 1.7525 - classification_loss: 0.3336 418/500 [========================>.....] - ETA: 27s - loss: 2.0868 - regression_loss: 1.7533 - classification_loss: 0.3335 419/500 [========================>.....] - ETA: 26s - loss: 2.0871 - regression_loss: 1.7536 - classification_loss: 0.3335 420/500 [========================>.....] - ETA: 26s - loss: 2.0875 - regression_loss: 1.7539 - classification_loss: 0.3336 421/500 [========================>.....] - ETA: 26s - loss: 2.0872 - regression_loss: 1.7536 - classification_loss: 0.3337 422/500 [========================>.....] - ETA: 25s - loss: 2.0859 - regression_loss: 1.7519 - classification_loss: 0.3340 423/500 [========================>.....] - ETA: 25s - loss: 2.0861 - regression_loss: 1.7521 - classification_loss: 0.3340 424/500 [========================>.....] - ETA: 25s - loss: 2.0855 - regression_loss: 1.7516 - classification_loss: 0.3339 425/500 [========================>.....] 
- ETA: 24s - loss: 2.0852 - regression_loss: 1.7513 - classification_loss: 0.3339 426/500 [========================>.....] - ETA: 24s - loss: 2.0851 - regression_loss: 1.7514 - classification_loss: 0.3338 427/500 [========================>.....] - ETA: 24s - loss: 2.0855 - regression_loss: 1.7517 - classification_loss: 0.3338 428/500 [========================>.....] - ETA: 23s - loss: 2.0846 - regression_loss: 1.7508 - classification_loss: 0.3338 429/500 [========================>.....] - ETA: 23s - loss: 2.0818 - regression_loss: 1.7483 - classification_loss: 0.3335 430/500 [========================>.....] - ETA: 23s - loss: 2.0813 - regression_loss: 1.7480 - classification_loss: 0.3333 431/500 [========================>.....] - ETA: 22s - loss: 2.0812 - regression_loss: 1.7481 - classification_loss: 0.3331 432/500 [========================>.....] - ETA: 22s - loss: 2.0819 - regression_loss: 1.7487 - classification_loss: 0.3332 433/500 [========================>.....] - ETA: 22s - loss: 2.0807 - regression_loss: 1.7477 - classification_loss: 0.3330 434/500 [=========================>....] - ETA: 21s - loss: 2.0801 - regression_loss: 1.7471 - classification_loss: 0.3330 435/500 [=========================>....] - ETA: 21s - loss: 2.0791 - regression_loss: 1.7463 - classification_loss: 0.3329 436/500 [=========================>....] - ETA: 21s - loss: 2.0768 - regression_loss: 1.7444 - classification_loss: 0.3324 437/500 [=========================>....] - ETA: 20s - loss: 2.0763 - regression_loss: 1.7438 - classification_loss: 0.3325 438/500 [=========================>....] - ETA: 20s - loss: 2.0771 - regression_loss: 1.7444 - classification_loss: 0.3327 439/500 [=========================>....] - ETA: 20s - loss: 2.0769 - regression_loss: 1.7441 - classification_loss: 0.3328 440/500 [=========================>....] - ETA: 19s - loss: 2.0774 - regression_loss: 1.7445 - classification_loss: 0.3329 441/500 [=========================>....] 
- ETA: 19s - loss: 2.0782 - regression_loss: 1.7453 - classification_loss: 0.3329 442/500 [=========================>....] - ETA: 19s - loss: 2.0761 - regression_loss: 1.7433 - classification_loss: 0.3328 443/500 [=========================>....] - ETA: 18s - loss: 2.0777 - regression_loss: 1.7448 - classification_loss: 0.3329 444/500 [=========================>....] - ETA: 18s - loss: 2.0760 - regression_loss: 1.7434 - classification_loss: 0.3326 445/500 [=========================>....] - ETA: 18s - loss: 2.0735 - regression_loss: 1.7413 - classification_loss: 0.3323 446/500 [=========================>....] - ETA: 17s - loss: 2.0726 - regression_loss: 1.7405 - classification_loss: 0.3321 447/500 [=========================>....] - ETA: 17s - loss: 2.0729 - regression_loss: 1.7407 - classification_loss: 0.3322 448/500 [=========================>....] - ETA: 17s - loss: 2.0727 - regression_loss: 1.7406 - classification_loss: 0.3321 449/500 [=========================>....] - ETA: 16s - loss: 2.0725 - regression_loss: 1.7406 - classification_loss: 0.3320 450/500 [==========================>...] - ETA: 16s - loss: 2.0727 - regression_loss: 1.7408 - classification_loss: 0.3320 451/500 [==========================>...] - ETA: 16s - loss: 2.0730 - regression_loss: 1.7409 - classification_loss: 0.3320 452/500 [==========================>...] - ETA: 15s - loss: 2.0713 - regression_loss: 1.7396 - classification_loss: 0.3317 453/500 [==========================>...] - ETA: 15s - loss: 2.0718 - regression_loss: 1.7400 - classification_loss: 0.3318 454/500 [==========================>...] - ETA: 15s - loss: 2.0702 - regression_loss: 1.7387 - classification_loss: 0.3315 455/500 [==========================>...] - ETA: 14s - loss: 2.0685 - regression_loss: 1.7373 - classification_loss: 0.3312 456/500 [==========================>...] - ETA: 14s - loss: 2.0667 - regression_loss: 1.7357 - classification_loss: 0.3309 457/500 [==========================>...] 
- ETA: 14s - loss: 2.0682 - regression_loss: 1.7370 - classification_loss: 0.3312 458/500 [==========================>...] - ETA: 13s - loss: 2.0682 - regression_loss: 1.7370 - classification_loss: 0.3312 459/500 [==========================>...] - ETA: 13s - loss: 2.0683 - regression_loss: 1.7371 - classification_loss: 0.3312 460/500 [==========================>...] - ETA: 13s - loss: 2.0665 - regression_loss: 1.7355 - classification_loss: 0.3310 461/500 [==========================>...] - ETA: 12s - loss: 2.0664 - regression_loss: 1.7355 - classification_loss: 0.3310 462/500 [==========================>...] - ETA: 12s - loss: 2.0656 - regression_loss: 1.7348 - classification_loss: 0.3308 463/500 [==========================>...] - ETA: 12s - loss: 2.0670 - regression_loss: 1.7359 - classification_loss: 0.3311 464/500 [==========================>...] - ETA: 11s - loss: 2.0682 - regression_loss: 1.7370 - classification_loss: 0.3313 465/500 [==========================>...] - ETA: 11s - loss: 2.0692 - regression_loss: 1.7378 - classification_loss: 0.3314 466/500 [==========================>...] - ETA: 11s - loss: 2.0694 - regression_loss: 1.7381 - classification_loss: 0.3313 467/500 [===========================>..] - ETA: 10s - loss: 2.0693 - regression_loss: 1.7380 - classification_loss: 0.3313 468/500 [===========================>..] - ETA: 10s - loss: 2.0692 - regression_loss: 1.7379 - classification_loss: 0.3313 469/500 [===========================>..] - ETA: 10s - loss: 2.0698 - regression_loss: 1.7384 - classification_loss: 0.3314 470/500 [===========================>..] - ETA: 9s - loss: 2.0702 - regression_loss: 1.7389 - classification_loss: 0.3314  471/500 [===========================>..] - ETA: 9s - loss: 2.0702 - regression_loss: 1.7386 - classification_loss: 0.3316 472/500 [===========================>..] - ETA: 9s - loss: 2.0702 - regression_loss: 1.7387 - classification_loss: 0.3316 473/500 [===========================>..] 
- ETA: 8s - loss: 2.0705 - regression_loss: 1.7389 - classification_loss: 0.3316 474/500 [===========================>..] - ETA: 8s - loss: 2.0704 - regression_loss: 1.7390 - classification_loss: 0.3314 475/500 [===========================>..] - ETA: 8s - loss: 2.0701 - regression_loss: 1.7389 - classification_loss: 0.3312 476/500 [===========================>..] - ETA: 7s - loss: 2.0701 - regression_loss: 1.7389 - classification_loss: 0.3312 477/500 [===========================>..] - ETA: 7s - loss: 2.0692 - regression_loss: 1.7382 - classification_loss: 0.3310 478/500 [===========================>..] - ETA: 7s - loss: 2.0688 - regression_loss: 1.7379 - classification_loss: 0.3309 479/500 [===========================>..] - ETA: 6s - loss: 2.0691 - regression_loss: 1.7380 - classification_loss: 0.3310 480/500 [===========================>..] - ETA: 6s - loss: 2.0689 - regression_loss: 1.7379 - classification_loss: 0.3310 481/500 [===========================>..] - ETA: 6s - loss: 2.0691 - regression_loss: 1.7381 - classification_loss: 0.3310 482/500 [===========================>..] - ETA: 5s - loss: 2.0695 - regression_loss: 1.7385 - classification_loss: 0.3310 483/500 [===========================>..] - ETA: 5s - loss: 2.0699 - regression_loss: 1.7389 - classification_loss: 0.3310 484/500 [============================>.] - ETA: 5s - loss: 2.0680 - regression_loss: 1.7374 - classification_loss: 0.3307 485/500 [============================>.] - ETA: 4s - loss: 2.0682 - regression_loss: 1.7375 - classification_loss: 0.3307 486/500 [============================>.] - ETA: 4s - loss: 2.0682 - regression_loss: 1.7375 - classification_loss: 0.3307 487/500 [============================>.] - ETA: 4s - loss: 2.0682 - regression_loss: 1.7375 - classification_loss: 0.3307 488/500 [============================>.] - ETA: 3s - loss: 2.0687 - regression_loss: 1.7380 - classification_loss: 0.3307 489/500 [============================>.] 
- ETA: 3s - loss: 2.0692 - regression_loss: 1.7383 - classification_loss: 0.3309
[Steps 490-499 trimmed; the running loss held near 2.067.]
500/500 [==============================] - 165s 330ms/step - loss: 2.0682 - regression_loss: 1.7376 - classification_loss: 0.3306
1172 instances of class plum with average precision: 0.4035
mAP: 0.4035
Epoch 00008: saving model to ./training/snapshots/resnet101_pascal_08.h5
Epoch 9/150
[Epoch 9, steps 1-3 trimmed; the running loss started near 1.42 and rose toward 2.04 as the average accumulated.]
4/500 [..............................]
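A note for readers working with logs like the one above: Keras prints running means over the steps seen so far, and for this RetinaNet the total `loss` is simply `regression_loss + classification_loss`. A minimal sketch of a parser for lines in this log's format follows; the regex and the function name `parse_losses` are assumptions for illustration, not part of Keras or keras-retinanet:

```python
import re

# An epoch-final summary line, copied verbatim from the log above.
LINE = ("500/500 [==============================] - 165s 330ms/step - "
        "loss: 2.0682 - regression_loss: 1.7376 - classification_loss: 0.3306")

def parse_losses(line):
    """Extract the named loss fields from one progress-bar line."""
    fields = dict(re.findall(r"(\w+_loss|loss): ([0-9.]+)", line))
    return {name: float(value) for name, value in fields.items()}

losses = parse_losses(LINE)
# The total is the sum of the box-regression term and the focal
# classification term; fields are printed to 4 decimals, so the sum
# matches only to within rounding error.
assert abs(losses["loss"]
           - (losses["regression_loss"] + losses["classification_loss"])) < 1e-3
```

Because every printed value is a running mean, single-step spikes are smoothed away; a step whose loss differs sharply from its neighbors moves the displayed value only slightly, which is why the columns above change in the third or fourth decimal place from one step to the next.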
- ETA: 2:41 - loss: 2.0444 - regression_loss: 1.7182 - classification_loss: 0.3262
[Epoch 9, steps 5-67: per-step progress updates trimmed; the running loss fluctuated between roughly 2.04 and 2.14.]
68/500 [===>..........................]
- ETA: 2:21 - loss: 2.0216 - regression_loss: 1.7060 - classification_loss: 0.3155 69/500 [===>..........................] - ETA: 2:21 - loss: 2.0108 - regression_loss: 1.6966 - classification_loss: 0.3142 70/500 [===>..........................] - ETA: 2:21 - loss: 2.0047 - regression_loss: 1.6915 - classification_loss: 0.3131 71/500 [===>..........................] - ETA: 2:20 - loss: 2.0070 - regression_loss: 1.6942 - classification_loss: 0.3128 72/500 [===>..........................] - ETA: 2:20 - loss: 2.0054 - regression_loss: 1.6931 - classification_loss: 0.3123 73/500 [===>..........................] - ETA: 2:20 - loss: 2.0064 - regression_loss: 1.6944 - classification_loss: 0.3120 74/500 [===>..........................] - ETA: 2:19 - loss: 2.0093 - regression_loss: 1.6966 - classification_loss: 0.3127 75/500 [===>..........................] - ETA: 2:19 - loss: 2.0087 - regression_loss: 1.6965 - classification_loss: 0.3122 76/500 [===>..........................] - ETA: 2:19 - loss: 2.0159 - regression_loss: 1.7014 - classification_loss: 0.3145 77/500 [===>..........................] - ETA: 2:18 - loss: 2.0146 - regression_loss: 1.7007 - classification_loss: 0.3139 78/500 [===>..........................] - ETA: 2:18 - loss: 2.0142 - regression_loss: 1.7011 - classification_loss: 0.3131 79/500 [===>..........................] - ETA: 2:18 - loss: 2.0169 - regression_loss: 1.7032 - classification_loss: 0.3137 80/500 [===>..........................] - ETA: 2:17 - loss: 2.0178 - regression_loss: 1.7041 - classification_loss: 0.3137 81/500 [===>..........................] - ETA: 2:17 - loss: 2.0244 - regression_loss: 1.7085 - classification_loss: 0.3159 82/500 [===>..........................] - ETA: 2:17 - loss: 2.0112 - regression_loss: 1.6973 - classification_loss: 0.3139 83/500 [===>..........................] - ETA: 2:16 - loss: 2.0047 - regression_loss: 1.6916 - classification_loss: 0.3132 84/500 [====>.........................] 
- ETA: 2:16 - loss: 2.0039 - regression_loss: 1.6908 - classification_loss: 0.3132 85/500 [====>.........................] - ETA: 2:16 - loss: 2.0049 - regression_loss: 1.6919 - classification_loss: 0.3130 86/500 [====>.........................] - ETA: 2:16 - loss: 2.0080 - regression_loss: 1.6948 - classification_loss: 0.3132 87/500 [====>.........................] - ETA: 2:15 - loss: 1.9998 - regression_loss: 1.6871 - classification_loss: 0.3127 88/500 [====>.........................] - ETA: 2:15 - loss: 2.0030 - regression_loss: 1.6898 - classification_loss: 0.3132 89/500 [====>.........................] - ETA: 2:15 - loss: 2.0067 - regression_loss: 1.6931 - classification_loss: 0.3136 90/500 [====>.........................] - ETA: 2:14 - loss: 2.0025 - regression_loss: 1.6891 - classification_loss: 0.3134 91/500 [====>.........................] - ETA: 2:14 - loss: 2.0029 - regression_loss: 1.6892 - classification_loss: 0.3136 92/500 [====>.........................] - ETA: 2:14 - loss: 2.0056 - regression_loss: 1.6921 - classification_loss: 0.3136 93/500 [====>.........................] - ETA: 2:13 - loss: 2.0132 - regression_loss: 1.6978 - classification_loss: 0.3155 94/500 [====>.........................] - ETA: 2:13 - loss: 2.0145 - regression_loss: 1.6988 - classification_loss: 0.3157 95/500 [====>.........................] - ETA: 2:13 - loss: 2.0149 - regression_loss: 1.6989 - classification_loss: 0.3160 96/500 [====>.........................] - ETA: 2:12 - loss: 2.0166 - regression_loss: 1.7007 - classification_loss: 0.3159 97/500 [====>.........................] - ETA: 2:12 - loss: 2.0175 - regression_loss: 1.7014 - classification_loss: 0.3160 98/500 [====>.........................] - ETA: 2:12 - loss: 2.0193 - regression_loss: 1.7031 - classification_loss: 0.3162 99/500 [====>.........................] - ETA: 2:11 - loss: 2.0179 - regression_loss: 1.7018 - classification_loss: 0.3162 100/500 [=====>........................] 
- ETA: 2:11 - loss: 2.0177 - regression_loss: 1.7015 - classification_loss: 0.3162 101/500 [=====>........................] - ETA: 2:11 - loss: 2.0206 - regression_loss: 1.7044 - classification_loss: 0.3162 102/500 [=====>........................] - ETA: 2:10 - loss: 2.0193 - regression_loss: 1.7034 - classification_loss: 0.3160 103/500 [=====>........................] - ETA: 2:10 - loss: 2.0205 - regression_loss: 1.7044 - classification_loss: 0.3161 104/500 [=====>........................] - ETA: 2:10 - loss: 2.0179 - regression_loss: 1.7027 - classification_loss: 0.3151 105/500 [=====>........................] - ETA: 2:09 - loss: 2.0184 - regression_loss: 1.7031 - classification_loss: 0.3153 106/500 [=====>........................] - ETA: 2:09 - loss: 2.0168 - regression_loss: 1.7024 - classification_loss: 0.3143 107/500 [=====>........................] - ETA: 2:09 - loss: 2.0179 - regression_loss: 1.7034 - classification_loss: 0.3145 108/500 [=====>........................] - ETA: 2:09 - loss: 2.0206 - regression_loss: 1.7057 - classification_loss: 0.3150 109/500 [=====>........................] - ETA: 2:08 - loss: 2.0209 - regression_loss: 1.7061 - classification_loss: 0.3148 110/500 [=====>........................] - ETA: 2:08 - loss: 2.0218 - regression_loss: 1.7067 - classification_loss: 0.3151 111/500 [=====>........................] - ETA: 2:08 - loss: 2.0269 - regression_loss: 1.7095 - classification_loss: 0.3174 112/500 [=====>........................] - ETA: 2:07 - loss: 2.0269 - regression_loss: 1.7092 - classification_loss: 0.3177 113/500 [=====>........................] - ETA: 2:07 - loss: 2.0271 - regression_loss: 1.7091 - classification_loss: 0.3179 114/500 [=====>........................] - ETA: 2:07 - loss: 2.0266 - regression_loss: 1.7090 - classification_loss: 0.3177 115/500 [=====>........................] - ETA: 2:07 - loss: 2.0194 - regression_loss: 1.7029 - classification_loss: 0.3165 116/500 [=====>........................] 
- ETA: 2:06 - loss: 2.0183 - regression_loss: 1.7024 - classification_loss: 0.3159 117/500 [======>.......................] - ETA: 2:06 - loss: 2.0085 - regression_loss: 1.6939 - classification_loss: 0.3146 118/500 [======>.......................] - ETA: 2:05 - loss: 2.0101 - regression_loss: 1.6954 - classification_loss: 0.3147 119/500 [======>.......................] - ETA: 2:05 - loss: 2.0093 - regression_loss: 1.6951 - classification_loss: 0.3141 120/500 [======>.......................] - ETA: 2:05 - loss: 2.0037 - regression_loss: 1.6903 - classification_loss: 0.3135 121/500 [======>.......................] - ETA: 2:04 - loss: 2.0010 - regression_loss: 1.6878 - classification_loss: 0.3133 122/500 [======>.......................] - ETA: 2:04 - loss: 2.0041 - regression_loss: 1.6905 - classification_loss: 0.3137 123/500 [======>.......................] - ETA: 2:04 - loss: 1.9977 - regression_loss: 1.6847 - classification_loss: 0.3130 124/500 [======>.......................] - ETA: 2:04 - loss: 1.9956 - regression_loss: 1.6829 - classification_loss: 0.3127 125/500 [======>.......................] - ETA: 2:03 - loss: 1.9938 - regression_loss: 1.6813 - classification_loss: 0.3125 126/500 [======>.......................] - ETA: 2:03 - loss: 1.9952 - regression_loss: 1.6826 - classification_loss: 0.3126 127/500 [======>.......................] - ETA: 2:03 - loss: 1.9971 - regression_loss: 1.6845 - classification_loss: 0.3126 128/500 [======>.......................] - ETA: 2:02 - loss: 1.9904 - regression_loss: 1.6786 - classification_loss: 0.3118 129/500 [======>.......................] - ETA: 2:02 - loss: 1.9917 - regression_loss: 1.6795 - classification_loss: 0.3122 130/500 [======>.......................] - ETA: 2:02 - loss: 1.9933 - regression_loss: 1.6806 - classification_loss: 0.3127 131/500 [======>.......................] - ETA: 2:01 - loss: 1.9941 - regression_loss: 1.6810 - classification_loss: 0.3131 132/500 [======>.......................] 
- ETA: 2:01 - loss: 1.9964 - regression_loss: 1.6825 - classification_loss: 0.3139 133/500 [======>.......................] - ETA: 2:01 - loss: 1.9961 - regression_loss: 1.6821 - classification_loss: 0.3140 134/500 [=======>......................] - ETA: 2:00 - loss: 1.9938 - regression_loss: 1.6805 - classification_loss: 0.3134 135/500 [=======>......................] - ETA: 2:00 - loss: 1.9986 - regression_loss: 1.6848 - classification_loss: 0.3138 136/500 [=======>......................] - ETA: 2:00 - loss: 1.9918 - regression_loss: 1.6791 - classification_loss: 0.3127 137/500 [=======>......................] - ETA: 1:59 - loss: 1.9925 - regression_loss: 1.6798 - classification_loss: 0.3126 138/500 [=======>......................] - ETA: 1:59 - loss: 1.9954 - regression_loss: 1.6824 - classification_loss: 0.3130 139/500 [=======>......................] - ETA: 1:59 - loss: 1.9943 - regression_loss: 1.6818 - classification_loss: 0.3125 140/500 [=======>......................] - ETA: 1:58 - loss: 1.9961 - regression_loss: 1.6830 - classification_loss: 0.3130 141/500 [=======>......................] - ETA: 1:58 - loss: 1.9961 - regression_loss: 1.6830 - classification_loss: 0.3132 142/500 [=======>......................] - ETA: 1:58 - loss: 1.9938 - regression_loss: 1.6812 - classification_loss: 0.3126 143/500 [=======>......................] - ETA: 1:58 - loss: 1.9960 - regression_loss: 1.6830 - classification_loss: 0.3131 144/500 [=======>......................] - ETA: 1:57 - loss: 1.9915 - regression_loss: 1.6782 - classification_loss: 0.3133 145/500 [=======>......................] - ETA: 1:57 - loss: 1.9946 - regression_loss: 1.6807 - classification_loss: 0.3139 146/500 [=======>......................] - ETA: 1:57 - loss: 1.9958 - regression_loss: 1.6819 - classification_loss: 0.3139 147/500 [=======>......................] - ETA: 1:56 - loss: 1.9902 - regression_loss: 1.6772 - classification_loss: 0.3130 148/500 [=======>......................] 
- ETA: 1:56 - loss: 1.9941 - regression_loss: 1.6801 - classification_loss: 0.3140 149/500 [=======>......................] - ETA: 1:56 - loss: 1.9956 - regression_loss: 1.6814 - classification_loss: 0.3142 150/500 [========>.....................] - ETA: 1:55 - loss: 1.9962 - regression_loss: 1.6811 - classification_loss: 0.3151 151/500 [========>.....................] - ETA: 1:55 - loss: 1.9942 - regression_loss: 1.6794 - classification_loss: 0.3148 152/500 [========>.....................] - ETA: 1:55 - loss: 1.9890 - regression_loss: 1.6750 - classification_loss: 0.3140 153/500 [========>.....................] - ETA: 1:54 - loss: 1.9872 - regression_loss: 1.6732 - classification_loss: 0.3140 154/500 [========>.....................] - ETA: 1:54 - loss: 1.9837 - regression_loss: 1.6702 - classification_loss: 0.3135 155/500 [========>.....................] - ETA: 1:54 - loss: 1.9861 - regression_loss: 1.6723 - classification_loss: 0.3138 156/500 [========>.....................] - ETA: 1:53 - loss: 1.9886 - regression_loss: 1.6744 - classification_loss: 0.3141 157/500 [========>.....................] - ETA: 1:53 - loss: 1.9896 - regression_loss: 1.6747 - classification_loss: 0.3149 158/500 [========>.....................] - ETA: 1:53 - loss: 1.9917 - regression_loss: 1.6769 - classification_loss: 0.3148 159/500 [========>.....................] - ETA: 1:52 - loss: 1.9928 - regression_loss: 1.6779 - classification_loss: 0.3150 160/500 [========>.....................] - ETA: 1:52 - loss: 1.9881 - regression_loss: 1.6742 - classification_loss: 0.3139 161/500 [========>.....................] - ETA: 1:52 - loss: 1.9875 - regression_loss: 1.6741 - classification_loss: 0.3134 162/500 [========>.....................] - ETA: 1:51 - loss: 1.9882 - regression_loss: 1.6747 - classification_loss: 0.3135 163/500 [========>.....................] - ETA: 1:51 - loss: 1.9885 - regression_loss: 1.6747 - classification_loss: 0.3138 164/500 [========>.....................] 
- ETA: 1:51 - loss: 1.9910 - regression_loss: 1.6769 - classification_loss: 0.3141 165/500 [========>.....................] - ETA: 1:50 - loss: 1.9928 - regression_loss: 1.6787 - classification_loss: 0.3141 166/500 [========>.....................] - ETA: 1:50 - loss: 1.9904 - regression_loss: 1.6768 - classification_loss: 0.3136 167/500 [=========>....................] - ETA: 1:50 - loss: 1.9914 - regression_loss: 1.6777 - classification_loss: 0.3137 168/500 [=========>....................] - ETA: 1:49 - loss: 1.9934 - regression_loss: 1.6794 - classification_loss: 0.3140 169/500 [=========>....................] - ETA: 1:49 - loss: 1.9970 - regression_loss: 1.6820 - classification_loss: 0.3149 170/500 [=========>....................] - ETA: 1:49 - loss: 1.9987 - regression_loss: 1.6811 - classification_loss: 0.3176 171/500 [=========>....................] - ETA: 1:48 - loss: 2.0039 - regression_loss: 1.6851 - classification_loss: 0.3188 172/500 [=========>....................] - ETA: 1:48 - loss: 2.0100 - regression_loss: 1.6901 - classification_loss: 0.3199 173/500 [=========>....................] - ETA: 1:48 - loss: 2.0099 - regression_loss: 1.6900 - classification_loss: 0.3198 174/500 [=========>....................] - ETA: 1:47 - loss: 2.0120 - regression_loss: 1.6921 - classification_loss: 0.3199 175/500 [=========>....................] - ETA: 1:47 - loss: 2.0139 - regression_loss: 1.6938 - classification_loss: 0.3201 176/500 [=========>....................] - ETA: 1:47 - loss: 2.0122 - regression_loss: 1.6920 - classification_loss: 0.3202 177/500 [=========>....................] - ETA: 1:46 - loss: 2.0126 - regression_loss: 1.6923 - classification_loss: 0.3203 178/500 [=========>....................] - ETA: 1:46 - loss: 2.0149 - regression_loss: 1.6942 - classification_loss: 0.3207 179/500 [=========>....................] - ETA: 1:46 - loss: 2.0155 - regression_loss: 1.6942 - classification_loss: 0.3214 180/500 [=========>....................] 
- ETA: 1:45 - loss: 2.0167 - regression_loss: 1.6951 - classification_loss: 0.3217 181/500 [=========>....................] - ETA: 1:45 - loss: 2.0110 - regression_loss: 1.6903 - classification_loss: 0.3208 182/500 [=========>....................] - ETA: 1:45 - loss: 2.0120 - regression_loss: 1.6910 - classification_loss: 0.3210 183/500 [=========>....................] - ETA: 1:44 - loss: 2.0130 - regression_loss: 1.6919 - classification_loss: 0.3211 184/500 [==========>...................] - ETA: 1:44 - loss: 2.0122 - regression_loss: 1.6916 - classification_loss: 0.3206 185/500 [==========>...................] - ETA: 1:43 - loss: 2.0126 - regression_loss: 1.6925 - classification_loss: 0.3201 186/500 [==========>...................] - ETA: 1:43 - loss: 2.0138 - regression_loss: 1.6934 - classification_loss: 0.3204 187/500 [==========>...................] - ETA: 1:43 - loss: 2.0124 - regression_loss: 1.6925 - classification_loss: 0.3199 188/500 [==========>...................] - ETA: 1:42 - loss: 2.0126 - regression_loss: 1.6927 - classification_loss: 0.3199 189/500 [==========>...................] - ETA: 1:42 - loss: 2.0129 - regression_loss: 1.6925 - classification_loss: 0.3204 190/500 [==========>...................] - ETA: 1:42 - loss: 2.0145 - regression_loss: 1.6936 - classification_loss: 0.3209 191/500 [==========>...................] - ETA: 1:41 - loss: 2.0169 - regression_loss: 1.6954 - classification_loss: 0.3215 192/500 [==========>...................] - ETA: 1:41 - loss: 2.0191 - regression_loss: 1.6972 - classification_loss: 0.3219 193/500 [==========>...................] - ETA: 1:41 - loss: 2.0196 - regression_loss: 1.6978 - classification_loss: 0.3218 194/500 [==========>...................] - ETA: 1:40 - loss: 2.0196 - regression_loss: 1.6977 - classification_loss: 0.3219 195/500 [==========>...................] - ETA: 1:40 - loss: 2.0179 - regression_loss: 1.6964 - classification_loss: 0.3215 196/500 [==========>...................] 
- ETA: 1:40 - loss: 2.0151 - regression_loss: 1.6940 - classification_loss: 0.3212 197/500 [==========>...................] - ETA: 1:40 - loss: 2.0178 - regression_loss: 1.6962 - classification_loss: 0.3216 198/500 [==========>...................] - ETA: 1:39 - loss: 2.0181 - regression_loss: 1.6962 - classification_loss: 0.3219 199/500 [==========>...................] - ETA: 1:39 - loss: 2.0190 - regression_loss: 1.6972 - classification_loss: 0.3218 200/500 [===========>..................] - ETA: 1:39 - loss: 2.0204 - regression_loss: 1.6984 - classification_loss: 0.3220 201/500 [===========>..................] - ETA: 1:38 - loss: 2.0218 - regression_loss: 1.6997 - classification_loss: 0.3220 202/500 [===========>..................] - ETA: 1:38 - loss: 2.0215 - regression_loss: 1.6992 - classification_loss: 0.3223 203/500 [===========>..................] - ETA: 1:38 - loss: 2.0250 - regression_loss: 1.7020 - classification_loss: 0.3231 204/500 [===========>..................] - ETA: 1:37 - loss: 2.0229 - regression_loss: 1.7004 - classification_loss: 0.3225 205/500 [===========>..................] - ETA: 1:37 - loss: 2.0235 - regression_loss: 1.7008 - classification_loss: 0.3227 206/500 [===========>..................] - ETA: 1:37 - loss: 2.0244 - regression_loss: 1.7014 - classification_loss: 0.3230 207/500 [===========>..................] - ETA: 1:36 - loss: 2.0263 - regression_loss: 1.7030 - classification_loss: 0.3233 208/500 [===========>..................] - ETA: 1:36 - loss: 2.0226 - regression_loss: 1.6998 - classification_loss: 0.3228 209/500 [===========>..................] - ETA: 1:36 - loss: 2.0224 - regression_loss: 1.7000 - classification_loss: 0.3224 210/500 [===========>..................] - ETA: 1:35 - loss: 2.0240 - regression_loss: 1.7017 - classification_loss: 0.3223 211/500 [===========>..................] - ETA: 1:35 - loss: 2.0263 - regression_loss: 1.7035 - classification_loss: 0.3228 212/500 [===========>..................] 
- ETA: 1:35 - loss: 2.0243 - regression_loss: 1.7020 - classification_loss: 0.3222 213/500 [===========>..................] - ETA: 1:34 - loss: 2.0226 - regression_loss: 1.7009 - classification_loss: 0.3217 214/500 [===========>..................] - ETA: 1:34 - loss: 2.0236 - regression_loss: 1.7018 - classification_loss: 0.3218 215/500 [===========>..................] - ETA: 1:34 - loss: 2.0209 - regression_loss: 1.6993 - classification_loss: 0.3216 216/500 [===========>..................] - ETA: 1:33 - loss: 2.0213 - regression_loss: 1.6993 - classification_loss: 0.3220 217/500 [============>.................] - ETA: 1:33 - loss: 2.0276 - regression_loss: 1.7043 - classification_loss: 0.3233 218/500 [============>.................] - ETA: 1:33 - loss: 2.0251 - regression_loss: 1.7023 - classification_loss: 0.3227 219/500 [============>.................] - ETA: 1:32 - loss: 2.0256 - regression_loss: 1.7030 - classification_loss: 0.3227 220/500 [============>.................] - ETA: 1:32 - loss: 2.0226 - regression_loss: 1.7003 - classification_loss: 0.3223 221/500 [============>.................] - ETA: 1:32 - loss: 2.0177 - regression_loss: 1.6958 - classification_loss: 0.3219 222/500 [============>.................] - ETA: 1:31 - loss: 2.0186 - regression_loss: 1.6964 - classification_loss: 0.3221 223/500 [============>.................] - ETA: 1:31 - loss: 2.0164 - regression_loss: 1.6942 - classification_loss: 0.3222 224/500 [============>.................] - ETA: 1:31 - loss: 2.0159 - regression_loss: 1.6938 - classification_loss: 0.3221 225/500 [============>.................] - ETA: 1:30 - loss: 2.0129 - regression_loss: 1.6913 - classification_loss: 0.3216 226/500 [============>.................] - ETA: 1:30 - loss: 2.0135 - regression_loss: 1.6919 - classification_loss: 0.3215 227/500 [============>.................] - ETA: 1:30 - loss: 2.0147 - regression_loss: 1.6931 - classification_loss: 0.3216 228/500 [============>.................] 
- ETA: 1:29 - loss: 2.0145 - regression_loss: 1.6928 - classification_loss: 0.3217 229/500 [============>.................] - ETA: 1:29 - loss: 2.0161 - regression_loss: 1.6939 - classification_loss: 0.3222 230/500 [============>.................] - ETA: 1:29 - loss: 2.0152 - regression_loss: 1.6933 - classification_loss: 0.3219 231/500 [============>.................] - ETA: 1:28 - loss: 2.0155 - regression_loss: 1.6937 - classification_loss: 0.3218 232/500 [============>.................] - ETA: 1:28 - loss: 2.0162 - regression_loss: 1.6943 - classification_loss: 0.3218 233/500 [============>.................] - ETA: 1:28 - loss: 2.0173 - regression_loss: 1.6952 - classification_loss: 0.3221 234/500 [=============>................] - ETA: 1:27 - loss: 2.0168 - regression_loss: 1.6951 - classification_loss: 0.3217 235/500 [=============>................] - ETA: 1:27 - loss: 2.0176 - regression_loss: 1.6955 - classification_loss: 0.3220 236/500 [=============>................] - ETA: 1:27 - loss: 2.0172 - regression_loss: 1.6954 - classification_loss: 0.3219 237/500 [=============>................] - ETA: 1:26 - loss: 2.0183 - regression_loss: 1.6963 - classification_loss: 0.3220 238/500 [=============>................] - ETA: 1:26 - loss: 2.0165 - regression_loss: 1.6947 - classification_loss: 0.3219 239/500 [=============>................] - ETA: 1:26 - loss: 2.0146 - regression_loss: 1.6932 - classification_loss: 0.3213 240/500 [=============>................] - ETA: 1:25 - loss: 2.0162 - regression_loss: 1.6945 - classification_loss: 0.3217 241/500 [=============>................] - ETA: 1:25 - loss: 2.0180 - regression_loss: 1.6962 - classification_loss: 0.3218 242/500 [=============>................] - ETA: 1:25 - loss: 2.0179 - regression_loss: 1.6963 - classification_loss: 0.3217 243/500 [=============>................] - ETA: 1:24 - loss: 2.0186 - regression_loss: 1.6969 - classification_loss: 0.3216 244/500 [=============>................] 
- ETA: 1:24 - loss: 2.0173 - regression_loss: 1.6959 - classification_loss: 0.3213 245/500 [=============>................] - ETA: 1:24 - loss: 2.0178 - regression_loss: 1.6964 - classification_loss: 0.3214 246/500 [=============>................] - ETA: 1:23 - loss: 2.0209 - regression_loss: 1.6984 - classification_loss: 0.3225 247/500 [=============>................] - ETA: 1:23 - loss: 2.0191 - regression_loss: 1.6963 - classification_loss: 0.3227 248/500 [=============>................] - ETA: 1:23 - loss: 2.0184 - regression_loss: 1.6956 - classification_loss: 0.3228 249/500 [=============>................] - ETA: 1:22 - loss: 2.0203 - regression_loss: 1.6972 - classification_loss: 0.3230 250/500 [==============>...............] - ETA: 1:22 - loss: 2.0189 - regression_loss: 1.6961 - classification_loss: 0.3228 251/500 [==============>...............] - ETA: 1:22 - loss: 2.0178 - regression_loss: 1.6951 - classification_loss: 0.3226 252/500 [==============>...............] - ETA: 1:21 - loss: 2.0151 - regression_loss: 1.6930 - classification_loss: 0.3221 253/500 [==============>...............] - ETA: 1:21 - loss: 2.0165 - regression_loss: 1.6942 - classification_loss: 0.3223 254/500 [==============>...............] - ETA: 1:21 - loss: 2.0189 - regression_loss: 1.6961 - classification_loss: 0.3228 255/500 [==============>...............] - ETA: 1:20 - loss: 2.0193 - regression_loss: 1.6964 - classification_loss: 0.3229 256/500 [==============>...............] - ETA: 1:20 - loss: 2.0149 - regression_loss: 1.6926 - classification_loss: 0.3223 257/500 [==============>...............] - ETA: 1:20 - loss: 2.0121 - regression_loss: 1.6903 - classification_loss: 0.3218 258/500 [==============>...............] - ETA: 1:19 - loss: 2.0127 - regression_loss: 1.6907 - classification_loss: 0.3220 259/500 [==============>...............] - ETA: 1:19 - loss: 2.0127 - regression_loss: 1.6908 - classification_loss: 0.3219 260/500 [==============>...............] 
- ETA: 1:19 - loss: 2.0120 - regression_loss: 1.6899 - classification_loss: 0.3221 261/500 [==============>...............] - ETA: 1:18 - loss: 2.0091 - regression_loss: 1.6876 - classification_loss: 0.3215 262/500 [==============>...............] - ETA: 1:18 - loss: 2.0105 - regression_loss: 1.6888 - classification_loss: 0.3217 263/500 [==============>...............] - ETA: 1:18 - loss: 2.0124 - regression_loss: 1.6904 - classification_loss: 0.3220 264/500 [==============>...............] - ETA: 1:17 - loss: 2.0106 - regression_loss: 1.6886 - classification_loss: 0.3219 265/500 [==============>...............] - ETA: 1:17 - loss: 2.0091 - regression_loss: 1.6874 - classification_loss: 0.3217 266/500 [==============>...............] - ETA: 1:17 - loss: 2.0113 - regression_loss: 1.6892 - classification_loss: 0.3221 267/500 [===============>..............] - ETA: 1:16 - loss: 2.0079 - regression_loss: 1.6862 - classification_loss: 0.3217 268/500 [===============>..............] - ETA: 1:16 - loss: 2.0113 - regression_loss: 1.6889 - classification_loss: 0.3223 269/500 [===============>..............] - ETA: 1:16 - loss: 2.0117 - regression_loss: 1.6893 - classification_loss: 0.3224 270/500 [===============>..............] - ETA: 1:15 - loss: 2.0118 - regression_loss: 1.6895 - classification_loss: 0.3223 271/500 [===============>..............] - ETA: 1:15 - loss: 2.0116 - regression_loss: 1.6894 - classification_loss: 0.3222 272/500 [===============>..............] - ETA: 1:15 - loss: 2.0120 - regression_loss: 1.6900 - classification_loss: 0.3220 273/500 [===============>..............] - ETA: 1:15 - loss: 2.0123 - regression_loss: 1.6903 - classification_loss: 0.3220 274/500 [===============>..............] - ETA: 1:14 - loss: 2.0135 - regression_loss: 1.6914 - classification_loss: 0.3221 275/500 [===============>..............] - ETA: 1:14 - loss: 2.0102 - regression_loss: 1.6887 - classification_loss: 0.3215 276/500 [===============>..............] 
- ETA: 1:14 - loss: 2.0074 - regression_loss: 1.6861 - classification_loss: 0.3213 277/500 [===============>..............] - ETA: 1:13 - loss: 2.0098 - regression_loss: 1.6883 - classification_loss: 0.3215 278/500 [===============>..............] - ETA: 1:13 - loss: 2.0074 - regression_loss: 1.6862 - classification_loss: 0.3212 279/500 [===============>..............] - ETA: 1:13 - loss: 2.0081 - regression_loss: 1.6867 - classification_loss: 0.3214 280/500 [===============>..............] - ETA: 1:12 - loss: 2.0083 - regression_loss: 1.6869 - classification_loss: 0.3214 281/500 [===============>..............] - ETA: 1:12 - loss: 2.0087 - regression_loss: 1.6873 - classification_loss: 0.3214 282/500 [===============>..............] - ETA: 1:12 - loss: 2.0087 - regression_loss: 1.6875 - classification_loss: 0.3212 283/500 [===============>..............] - ETA: 1:11 - loss: 2.0089 - regression_loss: 1.6878 - classification_loss: 0.3211 284/500 [================>.............] - ETA: 1:11 - loss: 2.0094 - regression_loss: 1.6883 - classification_loss: 0.3211 285/500 [================>.............] - ETA: 1:11 - loss: 2.0097 - regression_loss: 1.6887 - classification_loss: 0.3210 286/500 [================>.............] - ETA: 1:10 - loss: 2.0098 - regression_loss: 1.6888 - classification_loss: 0.3210 287/500 [================>.............] - ETA: 1:10 - loss: 2.0098 - regression_loss: 1.6889 - classification_loss: 0.3209 288/500 [================>.............] - ETA: 1:10 - loss: 2.0076 - regression_loss: 1.6871 - classification_loss: 0.3205 289/500 [================>.............] - ETA: 1:09 - loss: 2.0082 - regression_loss: 1.6876 - classification_loss: 0.3206 290/500 [================>.............] - ETA: 1:09 - loss: 2.0075 - regression_loss: 1.6870 - classification_loss: 0.3205 291/500 [================>.............] - ETA: 1:09 - loss: 2.0085 - regression_loss: 1.6880 - classification_loss: 0.3205 292/500 [================>.............] 
- ETA: 1:08 - loss: 2.0098 - regression_loss: 1.6892 - classification_loss: 0.3206 293/500 [================>.............] - ETA: 1:08 - loss: 2.0088 - regression_loss: 1.6885 - classification_loss: 0.3203 294/500 [================>.............] - ETA: 1:08 - loss: 2.0095 - regression_loss: 1.6890 - classification_loss: 0.3205 295/500 [================>.............] - ETA: 1:07 - loss: 2.0088 - regression_loss: 1.6884 - classification_loss: 0.3204 296/500 [================>.............] - ETA: 1:07 - loss: 2.0083 - regression_loss: 1.6881 - classification_loss: 0.3202 297/500 [================>.............] - ETA: 1:07 - loss: 2.0073 - regression_loss: 1.6873 - classification_loss: 0.3201 298/500 [================>.............] - ETA: 1:06 - loss: 2.0071 - regression_loss: 1.6870 - classification_loss: 0.3201 299/500 [================>.............] - ETA: 1:06 - loss: 2.0069 - regression_loss: 1.6869 - classification_loss: 0.3200 300/500 [=================>............] - ETA: 1:06 - loss: 2.0057 - regression_loss: 1.6858 - classification_loss: 0.3199 301/500 [=================>............] - ETA: 1:05 - loss: 2.0067 - regression_loss: 1.6864 - classification_loss: 0.3203 302/500 [=================>............] - ETA: 1:05 - loss: 2.0046 - regression_loss: 1.6847 - classification_loss: 0.3198 303/500 [=================>............] - ETA: 1:05 - loss: 2.0049 - regression_loss: 1.6851 - classification_loss: 0.3198 304/500 [=================>............] - ETA: 1:04 - loss: 2.0044 - regression_loss: 1.6849 - classification_loss: 0.3195 305/500 [=================>............] - ETA: 1:04 - loss: 2.0046 - regression_loss: 1.6849 - classification_loss: 0.3197 306/500 [=================>............] - ETA: 1:04 - loss: 2.0048 - regression_loss: 1.6851 - classification_loss: 0.3197 307/500 [=================>............] - ETA: 1:03 - loss: 2.0048 - regression_loss: 1.6848 - classification_loss: 0.3200 308/500 [=================>............] 
- ETA: 1:03 - loss: 2.0064 - regression_loss: 1.6861 - classification_loss: 0.3202 309/500 [=================>............] - ETA: 1:03 - loss: 2.0069 - regression_loss: 1.6866 - classification_loss: 0.3203 310/500 [=================>............] - ETA: 1:02 - loss: 2.0061 - regression_loss: 1.6859 - classification_loss: 0.3203 311/500 [=================>............] - ETA: 1:02 - loss: 2.0047 - regression_loss: 1.6847 - classification_loss: 0.3200 312/500 [=================>............] - ETA: 1:02 - loss: 2.0030 - regression_loss: 1.6831 - classification_loss: 0.3198 313/500 [=================>............] - ETA: 1:01 - loss: 2.0002 - regression_loss: 1.6807 - classification_loss: 0.3195 314/500 [=================>............] - ETA: 1:01 - loss: 2.0001 - regression_loss: 1.6805 - classification_loss: 0.3196 315/500 [=================>............] - ETA: 1:01 - loss: 2.0020 - regression_loss: 1.6823 - classification_loss: 0.3198 316/500 [=================>............] - ETA: 1:00 - loss: 2.0013 - regression_loss: 1.6819 - classification_loss: 0.3193 317/500 [==================>...........] - ETA: 1:00 - loss: 1.9994 - regression_loss: 1.6801 - classification_loss: 0.3193 318/500 [==================>...........] - ETA: 1:00 - loss: 1.9970 - regression_loss: 1.6779 - classification_loss: 0.3191 319/500 [==================>...........] - ETA: 59s - loss: 1.9975 - regression_loss: 1.6783 - classification_loss: 0.3192  320/500 [==================>...........] - ETA: 59s - loss: 1.9980 - regression_loss: 1.6787 - classification_loss: 0.3193 321/500 [==================>...........] - ETA: 59s - loss: 1.9980 - regression_loss: 1.6786 - classification_loss: 0.3194 322/500 [==================>...........] - ETA: 58s - loss: 1.9992 - regression_loss: 1.6793 - classification_loss: 0.3198 323/500 [==================>...........] - ETA: 58s - loss: 1.9968 - regression_loss: 1.6769 - classification_loss: 0.3199 324/500 [==================>...........] 
- ETA: 58s - loss: 1.9976 - regression_loss: 1.6775 - classification_loss: 0.3201 325/500 [==================>...........] - ETA: 57s - loss: 1.9985 - regression_loss: 1.6780 - classification_loss: 0.3206 326/500 [==================>...........] - ETA: 57s - loss: 1.9964 - regression_loss: 1.6760 - classification_loss: 0.3204 327/500 [==================>...........] - ETA: 57s - loss: 1.9970 - regression_loss: 1.6764 - classification_loss: 0.3205 328/500 [==================>...........] - ETA: 56s - loss: 1.9944 - regression_loss: 1.6742 - classification_loss: 0.3202 329/500 [==================>...........] - ETA: 56s - loss: 1.9950 - regression_loss: 1.6746 - classification_loss: 0.3204 330/500 [==================>...........] - ETA: 56s - loss: 1.9949 - regression_loss: 1.6747 - classification_loss: 0.3203 331/500 [==================>...........] - ETA: 55s - loss: 1.9936 - regression_loss: 1.6735 - classification_loss: 0.3201 332/500 [==================>...........] - ETA: 55s - loss: 1.9911 - regression_loss: 1.6713 - classification_loss: 0.3198 333/500 [==================>...........] - ETA: 55s - loss: 1.9894 - regression_loss: 1.6698 - classification_loss: 0.3195 334/500 [===================>..........] - ETA: 54s - loss: 1.9902 - regression_loss: 1.6704 - classification_loss: 0.3198 335/500 [===================>..........] - ETA: 54s - loss: 1.9908 - regression_loss: 1.6709 - classification_loss: 0.3199 336/500 [===================>..........] - ETA: 54s - loss: 1.9927 - regression_loss: 1.6726 - classification_loss: 0.3201 337/500 [===================>..........] - ETA: 53s - loss: 1.9943 - regression_loss: 1.6738 - classification_loss: 0.3206 338/500 [===================>..........] - ETA: 53s - loss: 1.9944 - regression_loss: 1.6738 - classification_loss: 0.3206 339/500 [===================>..........] - ETA: 53s - loss: 1.9942 - regression_loss: 1.6736 - classification_loss: 0.3206 340/500 [===================>..........] 
- ETA: 52s - loss: 1.9953 - regression_loss: 1.6745 - classification_loss: 0.3207 341/500 [===================>..........] - ETA: 52s - loss: 1.9955 - regression_loss: 1.6747 - classification_loss: 0.3208 342/500 [===================>..........] - ETA: 52s - loss: 1.9960 - regression_loss: 1.6752 - classification_loss: 0.3208 343/500 [===================>..........] - ETA: 51s - loss: 1.9970 - regression_loss: 1.6764 - classification_loss: 0.3206 344/500 [===================>..........] - ETA: 51s - loss: 1.9942 - regression_loss: 1.6739 - classification_loss: 0.3202 345/500 [===================>..........] - ETA: 51s - loss: 1.9927 - regression_loss: 1.6727 - classification_loss: 0.3200 346/500 [===================>..........] - ETA: 50s - loss: 1.9938 - regression_loss: 1.6736 - classification_loss: 0.3202 347/500 [===================>..........] - ETA: 50s - loss: 1.9922 - regression_loss: 1.6723 - classification_loss: 0.3199 348/500 [===================>..........] - ETA: 50s - loss: 1.9930 - regression_loss: 1.6728 - classification_loss: 0.3201 349/500 [===================>..........] - ETA: 50s - loss: 1.9936 - regression_loss: 1.6733 - classification_loss: 0.3203 350/500 [====================>.........] - ETA: 49s - loss: 1.9925 - regression_loss: 1.6725 - classification_loss: 0.3200 351/500 [====================>.........] - ETA: 49s - loss: 1.9923 - regression_loss: 1.6723 - classification_loss: 0.3199 352/500 [====================>.........] - ETA: 49s - loss: 1.9915 - regression_loss: 1.6716 - classification_loss: 0.3199 353/500 [====================>.........] - ETA: 48s - loss: 1.9922 - regression_loss: 1.6723 - classification_loss: 0.3199 354/500 [====================>.........] - ETA: 48s - loss: 1.9915 - regression_loss: 1.6717 - classification_loss: 0.3198 355/500 [====================>.........] - ETA: 48s - loss: 1.9882 - regression_loss: 1.6689 - classification_loss: 0.3193 356/500 [====================>.........] 
- ETA: 47s - loss: 1.9888 - regression_loss: 1.6692 - classification_loss: 0.3196 357/500 [====================>.........] - ETA: 47s - loss: 1.9898 - regression_loss: 1.6702 - classification_loss: 0.3196 358/500 [====================>.........] - ETA: 47s - loss: 1.9904 - regression_loss: 1.6706 - classification_loss: 0.3197 359/500 [====================>.........] - ETA: 46s - loss: 1.9890 - regression_loss: 1.6696 - classification_loss: 0.3194 360/500 [====================>.........] - ETA: 46s - loss: 1.9891 - regression_loss: 1.6698 - classification_loss: 0.3194 361/500 [====================>.........] - ETA: 46s - loss: 1.9886 - regression_loss: 1.6694 - classification_loss: 0.3192 362/500 [====================>.........] - ETA: 45s - loss: 1.9896 - regression_loss: 1.6704 - classification_loss: 0.3192 363/500 [====================>.........] - ETA: 45s - loss: 1.9911 - regression_loss: 1.6717 - classification_loss: 0.3194 364/500 [====================>.........] - ETA: 45s - loss: 1.9912 - regression_loss: 1.6718 - classification_loss: 0.3193 365/500 [====================>.........] - ETA: 44s - loss: 1.9919 - regression_loss: 1.6726 - classification_loss: 0.3193 366/500 [====================>.........] - ETA: 44s - loss: 1.9917 - regression_loss: 1.6724 - classification_loss: 0.3194 367/500 [=====================>........] - ETA: 44s - loss: 1.9925 - regression_loss: 1.6729 - classification_loss: 0.3196 368/500 [=====================>........] - ETA: 43s - loss: 1.9928 - regression_loss: 1.6732 - classification_loss: 0.3196 369/500 [=====================>........] - ETA: 43s - loss: 1.9936 - regression_loss: 1.6739 - classification_loss: 0.3197 370/500 [=====================>........] - ETA: 43s - loss: 1.9962 - regression_loss: 1.6759 - classification_loss: 0.3204 371/500 [=====================>........] - ETA: 42s - loss: 1.9969 - regression_loss: 1.6765 - classification_loss: 0.3204 372/500 [=====================>........] 
- ETA: 42s - loss: 1.9976 - regression_loss: 1.6771 - classification_loss: 0.3206 373/500 [=====================>........] - ETA: 42s - loss: 1.9979 - regression_loss: 1.6773 - classification_loss: 0.3206 374/500 [=====================>........] - ETA: 41s - loss: 1.9957 - regression_loss: 1.6753 - classification_loss: 0.3204 375/500 [=====================>........] - ETA: 41s - loss: 1.9958 - regression_loss: 1.6754 - classification_loss: 0.3205 376/500 [=====================>........] - ETA: 41s - loss: 1.9972 - regression_loss: 1.6767 - classification_loss: 0.3205 377/500 [=====================>........] - ETA: 40s - loss: 1.9976 - regression_loss: 1.6771 - classification_loss: 0.3206 378/500 [=====================>........] - ETA: 40s - loss: 1.9978 - regression_loss: 1.6773 - classification_loss: 0.3206 379/500 [=====================>........] - ETA: 40s - loss: 1.9960 - regression_loss: 1.6757 - classification_loss: 0.3204 380/500 [=====================>........] - ETA: 39s - loss: 1.9966 - regression_loss: 1.6762 - classification_loss: 0.3205 381/500 [=====================>........] - ETA: 39s - loss: 1.9967 - regression_loss: 1.6762 - classification_loss: 0.3205 382/500 [=====================>........] - ETA: 39s - loss: 1.9974 - regression_loss: 1.6769 - classification_loss: 0.3206 383/500 [=====================>........] - ETA: 38s - loss: 1.9964 - regression_loss: 1.6761 - classification_loss: 0.3202 384/500 [======================>.......] - ETA: 38s - loss: 1.9965 - regression_loss: 1.6763 - classification_loss: 0.3202 385/500 [======================>.......] - ETA: 38s - loss: 1.9964 - regression_loss: 1.6762 - classification_loss: 0.3202 386/500 [======================>.......] - ETA: 37s - loss: 1.9968 - regression_loss: 1.6765 - classification_loss: 0.3203 387/500 [======================>.......] - ETA: 37s - loss: 1.9973 - regression_loss: 1.6769 - classification_loss: 0.3204 388/500 [======================>.......] 
- ETA: 37s - loss: 1.9976 - regression_loss: 1.6773 - classification_loss: 0.3203 389/500 [======================>.......] - ETA: 36s - loss: 1.9976 - regression_loss: 1.6773 - classification_loss: 0.3203 390/500 [======================>.......] - ETA: 36s - loss: 1.9979 - regression_loss: 1.6775 - classification_loss: 0.3204 391/500 [======================>.......] - ETA: 36s - loss: 1.9973 - regression_loss: 1.6769 - classification_loss: 0.3204 392/500 [======================>.......] - ETA: 35s - loss: 1.9986 - regression_loss: 1.6781 - classification_loss: 0.3205 393/500 [======================>.......] - ETA: 35s - loss: 1.9991 - regression_loss: 1.6786 - classification_loss: 0.3205 394/500 [======================>.......] - ETA: 35s - loss: 1.9995 - regression_loss: 1.6790 - classification_loss: 0.3206 395/500 [======================>.......] - ETA: 34s - loss: 1.9998 - regression_loss: 1.6792 - classification_loss: 0.3206 396/500 [======================>.......] - ETA: 34s - loss: 1.9999 - regression_loss: 1.6793 - classification_loss: 0.3205 397/500 [======================>.......] - ETA: 34s - loss: 1.9989 - regression_loss: 1.6785 - classification_loss: 0.3203 398/500 [======================>.......] - ETA: 33s - loss: 1.9981 - regression_loss: 1.6781 - classification_loss: 0.3200 399/500 [======================>.......] - ETA: 33s - loss: 1.9983 - regression_loss: 1.6784 - classification_loss: 0.3199 400/500 [=======================>......] - ETA: 33s - loss: 1.9978 - regression_loss: 1.6782 - classification_loss: 0.3197 401/500 [=======================>......] - ETA: 32s - loss: 1.9986 - regression_loss: 1.6788 - classification_loss: 0.3198 402/500 [=======================>......] - ETA: 32s - loss: 1.9972 - regression_loss: 1.6777 - classification_loss: 0.3196 403/500 [=======================>......] - ETA: 32s - loss: 1.9981 - regression_loss: 1.6783 - classification_loss: 0.3197 404/500 [=======================>......] 
- ETA: 31s - loss: 1.9989 - regression_loss: 1.6787 - classification_loss: 0.3201 405/500 [=======================>......] - ETA: 31s - loss: 1.9991 - regression_loss: 1.6790 - classification_loss: 0.3201 406/500 [=======================>......] - ETA: 31s - loss: 2.0005 - regression_loss: 1.6803 - classification_loss: 0.3202 407/500 [=======================>......] - ETA: 30s - loss: 2.0007 - regression_loss: 1.6806 - classification_loss: 0.3202 408/500 [=======================>......] - ETA: 30s - loss: 2.0003 - regression_loss: 1.6802 - classification_loss: 0.3202 409/500 [=======================>......] - ETA: 30s - loss: 2.0000 - regression_loss: 1.6798 - classification_loss: 0.3203 410/500 [=======================>......] - ETA: 29s - loss: 1.9979 - regression_loss: 1.6780 - classification_loss: 0.3199 411/500 [=======================>......] - ETA: 29s - loss: 1.9992 - regression_loss: 1.6792 - classification_loss: 0.3200 412/500 [=======================>......] - ETA: 29s - loss: 1.9984 - regression_loss: 1.6785 - classification_loss: 0.3199 413/500 [=======================>......] - ETA: 28s - loss: 1.9982 - regression_loss: 1.6783 - classification_loss: 0.3198 414/500 [=======================>......] - ETA: 28s - loss: 1.9994 - regression_loss: 1.6795 - classification_loss: 0.3200 415/500 [=======================>......] - ETA: 28s - loss: 1.9988 - regression_loss: 1.6791 - classification_loss: 0.3197 416/500 [=======================>......] - ETA: 27s - loss: 1.9992 - regression_loss: 1.6794 - classification_loss: 0.3197 417/500 [========================>.....] - ETA: 27s - loss: 1.9969 - regression_loss: 1.6775 - classification_loss: 0.3194 418/500 [========================>.....] - ETA: 27s - loss: 1.9945 - regression_loss: 1.6755 - classification_loss: 0.3191 419/500 [========================>.....] - ETA: 26s - loss: 1.9945 - regression_loss: 1.6756 - classification_loss: 0.3189 420/500 [========================>.....] 
- ETA: 26s - loss: 1.9945 - regression_loss: 1.6756 - classification_loss: 0.3189 421/500 [========================>.....] - ETA: 26s - loss: 1.9948 - regression_loss: 1.6759 - classification_loss: 0.3189 422/500 [========================>.....] - ETA: 25s - loss: 1.9954 - regression_loss: 1.6765 - classification_loss: 0.3189 423/500 [========================>.....] - ETA: 25s - loss: 1.9937 - regression_loss: 1.6750 - classification_loss: 0.3188 424/500 [========================>.....] - ETA: 25s - loss: 1.9940 - regression_loss: 1.6753 - classification_loss: 0.3187 425/500 [========================>.....] - ETA: 24s - loss: 1.9939 - regression_loss: 1.6752 - classification_loss: 0.3187 426/500 [========================>.....] - ETA: 24s - loss: 1.9937 - regression_loss: 1.6750 - classification_loss: 0.3187 427/500 [========================>.....] - ETA: 24s - loss: 1.9918 - regression_loss: 1.6734 - classification_loss: 0.3184 428/500 [========================>.....] - ETA: 23s - loss: 1.9915 - regression_loss: 1.6732 - classification_loss: 0.3183 429/500 [========================>.....] - ETA: 23s - loss: 1.9905 - regression_loss: 1.6723 - classification_loss: 0.3182 430/500 [========================>.....] - ETA: 23s - loss: 1.9904 - regression_loss: 1.6724 - classification_loss: 0.3180 431/500 [========================>.....] - ETA: 22s - loss: 1.9908 - regression_loss: 1.6728 - classification_loss: 0.3180 432/500 [========================>.....] - ETA: 22s - loss: 1.9908 - regression_loss: 1.6728 - classification_loss: 0.3180 433/500 [========================>.....] - ETA: 22s - loss: 1.9899 - regression_loss: 1.6721 - classification_loss: 0.3178 434/500 [=========================>....] - ETA: 21s - loss: 1.9892 - regression_loss: 1.6714 - classification_loss: 0.3177 435/500 [=========================>....] - ETA: 21s - loss: 1.9882 - regression_loss: 1.6707 - classification_loss: 0.3175 436/500 [=========================>....] 
- ETA: 21s - loss: 1.9867 - regression_loss: 1.6693 - classification_loss: 0.3173 437/500 [=========================>....] - ETA: 20s - loss: 1.9840 - regression_loss: 1.6671 - classification_loss: 0.3169 438/500 [=========================>....] - ETA: 20s - loss: 1.9843 - regression_loss: 1.6674 - classification_loss: 0.3169 439/500 [=========================>....] - ETA: 20s - loss: 1.9853 - regression_loss: 1.6681 - classification_loss: 0.3172 440/500 [=========================>....] - ETA: 19s - loss: 1.9871 - regression_loss: 1.6695 - classification_loss: 0.3176 441/500 [=========================>....] - ETA: 19s - loss: 1.9875 - regression_loss: 1.6700 - classification_loss: 0.3176 442/500 [=========================>....] - ETA: 19s - loss: 1.9885 - regression_loss: 1.6707 - classification_loss: 0.3179 443/500 [=========================>....] - ETA: 18s - loss: 1.9884 - regression_loss: 1.6705 - classification_loss: 0.3180 444/500 [=========================>....] - ETA: 18s - loss: 1.9894 - regression_loss: 1.6714 - classification_loss: 0.3179 445/500 [=========================>....] - ETA: 18s - loss: 1.9901 - regression_loss: 1.6720 - classification_loss: 0.3181 446/500 [=========================>....] - ETA: 17s - loss: 1.9877 - regression_loss: 1.6700 - classification_loss: 0.3177 447/500 [=========================>....] - ETA: 17s - loss: 1.9868 - regression_loss: 1.6692 - classification_loss: 0.3176 448/500 [=========================>....] - ETA: 17s - loss: 1.9878 - regression_loss: 1.6699 - classification_loss: 0.3179 449/500 [=========================>....] - ETA: 16s - loss: 1.9884 - regression_loss: 1.6704 - classification_loss: 0.3180 450/500 [==========================>...] - ETA: 16s - loss: 1.9883 - regression_loss: 1.6705 - classification_loss: 0.3179 451/500 [==========================>...] - ETA: 16s - loss: 1.9884 - regression_loss: 1.6706 - classification_loss: 0.3179 452/500 [==========================>...] 
- ETA: 15s - loss: 1.9872 - regression_loss: 1.6691 - classification_loss: 0.3181 453/500 [==========================>...] - ETA: 15s - loss: 1.9881 - regression_loss: 1.6699 - classification_loss: 0.3182 454/500 [==========================>...] - ETA: 15s - loss: 1.9883 - regression_loss: 1.6702 - classification_loss: 0.3182 455/500 [==========================>...] - ETA: 14s - loss: 1.9898 - regression_loss: 1.6715 - classification_loss: 0.3183 456/500 [==========================>...] - ETA: 14s - loss: 1.9902 - regression_loss: 1.6720 - classification_loss: 0.3182 457/500 [==========================>...] - ETA: 14s - loss: 1.9902 - regression_loss: 1.6720 - classification_loss: 0.3182 458/500 [==========================>...] - ETA: 13s - loss: 1.9892 - regression_loss: 1.6712 - classification_loss: 0.3179 459/500 [==========================>...] - ETA: 13s - loss: 1.9885 - regression_loss: 1.6708 - classification_loss: 0.3176 460/500 [==========================>...] - ETA: 13s - loss: 1.9868 - regression_loss: 1.6695 - classification_loss: 0.3174 461/500 [==========================>...] - ETA: 12s - loss: 1.9874 - regression_loss: 1.6699 - classification_loss: 0.3175 462/500 [==========================>...] - ETA: 12s - loss: 1.9880 - regression_loss: 1.6704 - classification_loss: 0.3177 463/500 [==========================>...] - ETA: 12s - loss: 1.9882 - regression_loss: 1.6703 - classification_loss: 0.3178 464/500 [==========================>...] - ETA: 11s - loss: 1.9861 - regression_loss: 1.6685 - classification_loss: 0.3176 465/500 [==========================>...] - ETA: 11s - loss: 1.9848 - regression_loss: 1.6675 - classification_loss: 0.3173 466/500 [==========================>...] - ETA: 11s - loss: 1.9859 - regression_loss: 1.6685 - classification_loss: 0.3174 467/500 [===========================>..] - ETA: 10s - loss: 1.9852 - regression_loss: 1.6679 - classification_loss: 0.3173 468/500 [===========================>..] 
- ETA: 10s - loss: 1.9852 - regression_loss: 1.6679 - classification_loss: 0.3173 469/500 [===========================>..] - ETA: 10s - loss: 1.9856 - regression_loss: 1.6683 - classification_loss: 0.3173 470/500 [===========================>..] - ETA: 9s - loss: 1.9864 - regression_loss: 1.6690 - classification_loss: 0.3174  471/500 [===========================>..] - ETA: 9s - loss: 1.9835 - regression_loss: 1.6666 - classification_loss: 0.3170 472/500 [===========================>..] - ETA: 9s - loss: 1.9844 - regression_loss: 1.6673 - classification_loss: 0.3170 473/500 [===========================>..] - ETA: 8s - loss: 1.9847 - regression_loss: 1.6676 - classification_loss: 0.3171 474/500 [===========================>..] - ETA: 8s - loss: 1.9853 - regression_loss: 1.6682 - classification_loss: 0.3171 475/500 [===========================>..] - ETA: 8s - loss: 1.9835 - regression_loss: 1.6666 - classification_loss: 0.3168 476/500 [===========================>..] - ETA: 7s - loss: 1.9845 - regression_loss: 1.6675 - classification_loss: 0.3170 477/500 [===========================>..] - ETA: 7s - loss: 1.9848 - regression_loss: 1.6677 - classification_loss: 0.3171 478/500 [===========================>..] - ETA: 7s - loss: 1.9843 - regression_loss: 1.6674 - classification_loss: 0.3169 479/500 [===========================>..] - ETA: 6s - loss: 1.9847 - regression_loss: 1.6677 - classification_loss: 0.3170 480/500 [===========================>..] - ETA: 6s - loss: 1.9855 - regression_loss: 1.6684 - classification_loss: 0.3171 481/500 [===========================>..] - ETA: 6s - loss: 1.9856 - regression_loss: 1.6685 - classification_loss: 0.3171 482/500 [===========================>..] - ETA: 5s - loss: 1.9856 - regression_loss: 1.6686 - classification_loss: 0.3170 483/500 [===========================>..] - ETA: 5s - loss: 1.9855 - regression_loss: 1.6684 - classification_loss: 0.3170 484/500 [============================>.] 
- ETA: 5s - loss: 1.9854 - regression_loss: 1.6685 - classification_loss: 0.3169 485/500 [============================>.] - ETA: 4s - loss: 1.9861 - regression_loss: 1.6692 - classification_loss: 0.3169 486/500 [============================>.] - ETA: 4s - loss: 1.9866 - regression_loss: 1.6696 - classification_loss: 0.3170 487/500 [============================>.] - ETA: 4s - loss: 1.9866 - regression_loss: 1.6697 - classification_loss: 0.3169 488/500 [============================>.] - ETA: 3s - loss: 1.9863 - regression_loss: 1.6695 - classification_loss: 0.3168 489/500 [============================>.] - ETA: 3s - loss: 1.9861 - regression_loss: 1.6693 - classification_loss: 0.3169 490/500 [============================>.] - ETA: 3s - loss: 1.9857 - regression_loss: 1.6690 - classification_loss: 0.3167 491/500 [============================>.] - ETA: 2s - loss: 1.9861 - regression_loss: 1.6694 - classification_loss: 0.3167 492/500 [============================>.] - ETA: 2s - loss: 1.9863 - regression_loss: 1.6696 - classification_loss: 0.3167 493/500 [============================>.] - ETA: 2s - loss: 1.9862 - regression_loss: 1.6695 - classification_loss: 0.3167 494/500 [============================>.] - ETA: 1s - loss: 1.9863 - regression_loss: 1.6697 - classification_loss: 0.3166 495/500 [============================>.] - ETA: 1s - loss: 1.9864 - regression_loss: 1.6698 - classification_loss: 0.3165 496/500 [============================>.] - ETA: 1s - loss: 1.9866 - regression_loss: 1.6700 - classification_loss: 0.3166 497/500 [============================>.] - ETA: 0s - loss: 1.9877 - regression_loss: 1.6710 - classification_loss: 0.3167 498/500 [============================>.] - ETA: 0s - loss: 1.9878 - regression_loss: 1.6711 - classification_loss: 0.3168 499/500 [============================>.] 
- ETA: 0s - loss: 1.9884 - regression_loss: 1.6715 - classification_loss: 0.3169 500/500 [==============================] - 165s 331ms/step - loss: 1.9873 - regression_loss: 1.6707 - classification_loss: 0.3166
1172 instances of class plum with average precision: 0.4742
mAP: 0.4742
Epoch 00009: saving model to ./training/snapshots/resnet101_pascal_09.h5
Epoch 10/150
1/500 [..............................] - ETA: 2:32 - loss: 2.0292 - regression_loss: 1.7028 - classification_loss: 0.3264 2/500 [..............................] - ETA: 2:35 - loss: 1.9740 - regression_loss: 1.6536 - classification_loss: 0.3205 3/500 [..............................] - ETA: 2:38 - loss: 1.6957 - regression_loss: 1.3980 - classification_loss: 0.2977 4/500 [..............................] - ETA: 2:43 - loss: 1.8012 - regression_loss: 1.4867 - classification_loss: 0.3144 5/500 [..............................] - ETA: 2:42 - loss: 1.9709 - regression_loss: 1.6136 - classification_loss: 0.3573 6/500 [..............................] - ETA: 2:42 - loss: 1.9462 - regression_loss: 1.6022 - classification_loss: 0.3440 7/500 [..............................] - ETA: 2:42 - loss: 1.8641 - regression_loss: 1.5397 - classification_loss: 0.3244 8/500 [..............................] - ETA: 2:41 - loss: 1.8461 - regression_loss: 1.5291 - classification_loss: 0.3170 9/500 [..............................] - ETA: 2:40 - loss: 1.8611 - regression_loss: 1.5508 - classification_loss: 0.3102 10/500 [..............................] - ETA: 2:41 - loss: 1.8595 - regression_loss: 1.5512 - classification_loss: 0.3084 11/500 [..............................] - ETA: 2:41 - loss: 1.8015 - regression_loss: 1.5043 - classification_loss: 0.2972 12/500 [..............................] - ETA: 2:40 - loss: 1.8318 - regression_loss: 1.5349 - classification_loss: 0.2969 13/500 [..............................] - ETA: 2:40 - loss: 1.8542 - regression_loss: 1.5567 - classification_loss: 0.2976 14/500 [..............................] 
- ETA: 2:40 - loss: 1.8685 - regression_loss: 1.5692 - classification_loss: 0.2994 15/500 [..............................] - ETA: 2:39 - loss: 1.8346 - regression_loss: 1.5355 - classification_loss: 0.2990 16/500 [..............................] - ETA: 2:39 - loss: 1.8726 - regression_loss: 1.5663 - classification_loss: 0.3063 17/500 [>.............................] - ETA: 2:39 - loss: 1.8938 - regression_loss: 1.5837 - classification_loss: 0.3100 18/500 [>.............................] - ETA: 2:38 - loss: 1.9126 - regression_loss: 1.6036 - classification_loss: 0.3091 19/500 [>.............................] - ETA: 2:38 - loss: 1.9302 - regression_loss: 1.6160 - classification_loss: 0.3141 20/500 [>.............................] - ETA: 2:38 - loss: 1.8942 - regression_loss: 1.5861 - classification_loss: 0.3081 21/500 [>.............................] - ETA: 2:37 - loss: 1.8964 - regression_loss: 1.5876 - classification_loss: 0.3088 22/500 [>.............................] - ETA: 2:37 - loss: 1.8773 - regression_loss: 1.5736 - classification_loss: 0.3037 23/500 [>.............................] - ETA: 2:37 - loss: 1.8784 - regression_loss: 1.5761 - classification_loss: 0.3023 24/500 [>.............................] - ETA: 2:36 - loss: 1.8660 - regression_loss: 1.5638 - classification_loss: 0.3022 25/500 [>.............................] - ETA: 2:36 - loss: 1.8693 - regression_loss: 1.5677 - classification_loss: 0.3016 26/500 [>.............................] - ETA: 2:35 - loss: 1.8719 - regression_loss: 1.5723 - classification_loss: 0.2996 27/500 [>.............................] - ETA: 2:35 - loss: 1.8867 - regression_loss: 1.5875 - classification_loss: 0.2992 28/500 [>.............................] - ETA: 2:35 - loss: 1.8913 - regression_loss: 1.5917 - classification_loss: 0.2996 29/500 [>.............................] - ETA: 2:34 - loss: 1.8829 - regression_loss: 1.5839 - classification_loss: 0.2991 30/500 [>.............................] 
- ETA: 2:34 - loss: 1.8937 - regression_loss: 1.5936 - classification_loss: 0.3001 31/500 [>.............................] - ETA: 2:34 - loss: 1.8840 - regression_loss: 1.5872 - classification_loss: 0.2968 32/500 [>.............................] - ETA: 2:33 - loss: 1.8682 - regression_loss: 1.5748 - classification_loss: 0.2934 33/500 [>.............................] - ETA: 2:33 - loss: 1.8659 - regression_loss: 1.5745 - classification_loss: 0.2914 34/500 [=>............................] - ETA: 2:33 - loss: 1.8726 - regression_loss: 1.5829 - classification_loss: 0.2898 35/500 [=>............................] - ETA: 2:33 - loss: 1.8792 - regression_loss: 1.5867 - classification_loss: 0.2925 36/500 [=>............................] - ETA: 2:33 - loss: 1.8963 - regression_loss: 1.6016 - classification_loss: 0.2947 37/500 [=>............................] - ETA: 2:32 - loss: 1.9046 - regression_loss: 1.6075 - classification_loss: 0.2971 38/500 [=>............................] - ETA: 2:32 - loss: 1.9239 - regression_loss: 1.6213 - classification_loss: 0.3025 39/500 [=>............................] - ETA: 2:32 - loss: 1.9493 - regression_loss: 1.6396 - classification_loss: 0.3097 40/500 [=>............................] - ETA: 2:31 - loss: 1.9531 - regression_loss: 1.6422 - classification_loss: 0.3109 41/500 [=>............................] - ETA: 2:31 - loss: 1.9524 - regression_loss: 1.6410 - classification_loss: 0.3113 42/500 [=>............................] - ETA: 2:31 - loss: 1.9459 - regression_loss: 1.6350 - classification_loss: 0.3109 43/500 [=>............................] - ETA: 2:30 - loss: 1.9502 - regression_loss: 1.6396 - classification_loss: 0.3106 44/500 [=>............................] - ETA: 2:30 - loss: 1.9549 - regression_loss: 1.6417 - classification_loss: 0.3132 45/500 [=>............................] - ETA: 2:29 - loss: 1.9726 - regression_loss: 1.6550 - classification_loss: 0.3176 46/500 [=>............................] 
- ETA: 2:29 - loss: 1.9618 - regression_loss: 1.6462 - classification_loss: 0.3156
 47/500 [=>............................] - ETA: 2:29 - loss: 1.9599 - regression_loss: 1.6454 - classification_loss: 0.3145
[... per-batch progress-bar frames 48/500 through 380/500 elided (carriage-return display frames flattened into the log): loss peaks near 2.03 around batch 74, then declines steadily to ~1.91 (regression_loss 1.66 → 1.60, classification_loss 0.33 → 0.30); ETA falls from 2:28 to 39s ...]
381/500 [=====================>........] - ETA: 39s - loss: 1.9080 - regression_loss: 1.6054 - classification_loss: 0.3026
382/500 [=====================>........] 
- ETA: 38s - loss: 1.9081 - regression_loss: 1.6055 - classification_loss: 0.3027 383/500 [=====================>........] - ETA: 38s - loss: 1.9083 - regression_loss: 1.6057 - classification_loss: 0.3026 384/500 [======================>.......] - ETA: 38s - loss: 1.9087 - regression_loss: 1.6060 - classification_loss: 0.3028 385/500 [======================>.......] - ETA: 37s - loss: 1.9076 - regression_loss: 1.6051 - classification_loss: 0.3025 386/500 [======================>.......] - ETA: 37s - loss: 1.9051 - regression_loss: 1.6030 - classification_loss: 0.3021 387/500 [======================>.......] - ETA: 37s - loss: 1.9060 - regression_loss: 1.6036 - classification_loss: 0.3025 388/500 [======================>.......] - ETA: 36s - loss: 1.9069 - regression_loss: 1.6044 - classification_loss: 0.3025 389/500 [======================>.......] - ETA: 36s - loss: 1.9068 - regression_loss: 1.6041 - classification_loss: 0.3027 390/500 [======================>.......] - ETA: 36s - loss: 1.9053 - regression_loss: 1.6030 - classification_loss: 0.3024 391/500 [======================>.......] - ETA: 35s - loss: 1.9051 - regression_loss: 1.6027 - classification_loss: 0.3024 392/500 [======================>.......] - ETA: 35s - loss: 1.9026 - regression_loss: 1.6006 - classification_loss: 0.3021 393/500 [======================>.......] - ETA: 35s - loss: 1.9034 - regression_loss: 1.6013 - classification_loss: 0.3021 394/500 [======================>.......] - ETA: 34s - loss: 1.9045 - regression_loss: 1.6021 - classification_loss: 0.3024 395/500 [======================>.......] - ETA: 34s - loss: 1.9049 - regression_loss: 1.6025 - classification_loss: 0.3024 396/500 [======================>.......] - ETA: 34s - loss: 1.9053 - regression_loss: 1.6028 - classification_loss: 0.3025 397/500 [======================>.......] - ETA: 33s - loss: 1.9067 - regression_loss: 1.6039 - classification_loss: 0.3027 398/500 [======================>.......] 
- ETA: 33s - loss: 1.9069 - regression_loss: 1.6042 - classification_loss: 0.3027 399/500 [======================>.......] - ETA: 33s - loss: 1.9077 - regression_loss: 1.6049 - classification_loss: 0.3028 400/500 [=======================>......] - ETA: 33s - loss: 1.9086 - regression_loss: 1.6056 - classification_loss: 0.3030 401/500 [=======================>......] - ETA: 32s - loss: 1.9093 - regression_loss: 1.6064 - classification_loss: 0.3030 402/500 [=======================>......] - ETA: 32s - loss: 1.9099 - regression_loss: 1.6068 - classification_loss: 0.3032 403/500 [=======================>......] - ETA: 32s - loss: 1.9106 - regression_loss: 1.6074 - classification_loss: 0.3033 404/500 [=======================>......] - ETA: 31s - loss: 1.9098 - regression_loss: 1.6068 - classification_loss: 0.3029 405/500 [=======================>......] - ETA: 31s - loss: 1.9099 - regression_loss: 1.6067 - classification_loss: 0.3032 406/500 [=======================>......] - ETA: 31s - loss: 1.9110 - regression_loss: 1.6076 - classification_loss: 0.3034 407/500 [=======================>......] - ETA: 30s - loss: 1.9102 - regression_loss: 1.6069 - classification_loss: 0.3033 408/500 [=======================>......] - ETA: 30s - loss: 1.9105 - regression_loss: 1.6072 - classification_loss: 0.3034 409/500 [=======================>......] - ETA: 30s - loss: 1.9107 - regression_loss: 1.6073 - classification_loss: 0.3034 410/500 [=======================>......] - ETA: 29s - loss: 1.9128 - regression_loss: 1.6091 - classification_loss: 0.3037 411/500 [=======================>......] - ETA: 29s - loss: 1.9133 - regression_loss: 1.6096 - classification_loss: 0.3037 412/500 [=======================>......] - ETA: 29s - loss: 1.9137 - regression_loss: 1.6099 - classification_loss: 0.3038 413/500 [=======================>......] - ETA: 28s - loss: 1.9148 - regression_loss: 1.6109 - classification_loss: 0.3039 414/500 [=======================>......] 
- ETA: 28s - loss: 1.9146 - regression_loss: 1.6107 - classification_loss: 0.3039 415/500 [=======================>......] - ETA: 28s - loss: 1.9134 - regression_loss: 1.6095 - classification_loss: 0.3039 416/500 [=======================>......] - ETA: 27s - loss: 1.9141 - regression_loss: 1.6103 - classification_loss: 0.3039 417/500 [========================>.....] - ETA: 27s - loss: 1.9150 - regression_loss: 1.6109 - classification_loss: 0.3041 418/500 [========================>.....] - ETA: 27s - loss: 1.9156 - regression_loss: 1.6114 - classification_loss: 0.3041 419/500 [========================>.....] - ETA: 26s - loss: 1.9169 - regression_loss: 1.6123 - classification_loss: 0.3045 420/500 [========================>.....] - ETA: 26s - loss: 1.9175 - regression_loss: 1.6130 - classification_loss: 0.3045 421/500 [========================>.....] - ETA: 26s - loss: 1.9181 - regression_loss: 1.6136 - classification_loss: 0.3045 422/500 [========================>.....] - ETA: 25s - loss: 1.9187 - regression_loss: 1.6141 - classification_loss: 0.3046 423/500 [========================>.....] - ETA: 25s - loss: 1.9188 - regression_loss: 1.6142 - classification_loss: 0.3045 424/500 [========================>.....] - ETA: 25s - loss: 1.9186 - regression_loss: 1.6141 - classification_loss: 0.3045 425/500 [========================>.....] - ETA: 24s - loss: 1.9181 - regression_loss: 1.6137 - classification_loss: 0.3045 426/500 [========================>.....] - ETA: 24s - loss: 1.9189 - regression_loss: 1.6144 - classification_loss: 0.3045 427/500 [========================>.....] - ETA: 24s - loss: 1.9204 - regression_loss: 1.6158 - classification_loss: 0.3046 428/500 [========================>.....] - ETA: 23s - loss: 1.9210 - regression_loss: 1.6164 - classification_loss: 0.3046 429/500 [========================>.....] - ETA: 23s - loss: 1.9212 - regression_loss: 1.6166 - classification_loss: 0.3047 430/500 [========================>.....] 
- ETA: 23s - loss: 1.9221 - regression_loss: 1.6175 - classification_loss: 0.3047 431/500 [========================>.....] - ETA: 22s - loss: 1.9218 - regression_loss: 1.6172 - classification_loss: 0.3046 432/500 [========================>.....] - ETA: 22s - loss: 1.9213 - regression_loss: 1.6169 - classification_loss: 0.3044 433/500 [========================>.....] - ETA: 22s - loss: 1.9198 - regression_loss: 1.6153 - classification_loss: 0.3045 434/500 [=========================>....] - ETA: 21s - loss: 1.9197 - regression_loss: 1.6153 - classification_loss: 0.3045 435/500 [=========================>....] - ETA: 21s - loss: 1.9173 - regression_loss: 1.6129 - classification_loss: 0.3044 436/500 [=========================>....] - ETA: 21s - loss: 1.9167 - regression_loss: 1.6125 - classification_loss: 0.3042 437/500 [=========================>....] - ETA: 20s - loss: 1.9178 - regression_loss: 1.6136 - classification_loss: 0.3043 438/500 [=========================>....] - ETA: 20s - loss: 1.9184 - regression_loss: 1.6140 - classification_loss: 0.3043 439/500 [=========================>....] - ETA: 20s - loss: 1.9180 - regression_loss: 1.6138 - classification_loss: 0.3043 440/500 [=========================>....] - ETA: 19s - loss: 1.9178 - regression_loss: 1.6136 - classification_loss: 0.3042 441/500 [=========================>....] - ETA: 19s - loss: 1.9165 - regression_loss: 1.6126 - classification_loss: 0.3039 442/500 [=========================>....] - ETA: 19s - loss: 1.9168 - regression_loss: 1.6129 - classification_loss: 0.3039 443/500 [=========================>....] - ETA: 18s - loss: 1.9170 - regression_loss: 1.6131 - classification_loss: 0.3039 444/500 [=========================>....] - ETA: 18s - loss: 1.9175 - regression_loss: 1.6137 - classification_loss: 0.3038 445/500 [=========================>....] - ETA: 18s - loss: 1.9152 - regression_loss: 1.6116 - classification_loss: 0.3036 446/500 [=========================>....] 
- ETA: 17s - loss: 1.9157 - regression_loss: 1.6121 - classification_loss: 0.3036 447/500 [=========================>....] - ETA: 17s - loss: 1.9156 - regression_loss: 1.6121 - classification_loss: 0.3036 448/500 [=========================>....] - ETA: 17s - loss: 1.9165 - regression_loss: 1.6127 - classification_loss: 0.3038 449/500 [=========================>....] - ETA: 16s - loss: 1.9166 - regression_loss: 1.6128 - classification_loss: 0.3038 450/500 [==========================>...] - ETA: 16s - loss: 1.9164 - regression_loss: 1.6126 - classification_loss: 0.3038 451/500 [==========================>...] - ETA: 16s - loss: 1.9167 - regression_loss: 1.6129 - classification_loss: 0.3038 452/500 [==========================>...] - ETA: 15s - loss: 1.9170 - regression_loss: 1.6131 - classification_loss: 0.3039 453/500 [==========================>...] - ETA: 15s - loss: 1.9165 - regression_loss: 1.6129 - classification_loss: 0.3036 454/500 [==========================>...] - ETA: 15s - loss: 1.9167 - regression_loss: 1.6130 - classification_loss: 0.3036 455/500 [==========================>...] - ETA: 14s - loss: 1.9170 - regression_loss: 1.6133 - classification_loss: 0.3037 456/500 [==========================>...] - ETA: 14s - loss: 1.9168 - regression_loss: 1.6132 - classification_loss: 0.3036 457/500 [==========================>...] - ETA: 14s - loss: 1.9172 - regression_loss: 1.6136 - classification_loss: 0.3035 458/500 [==========================>...] - ETA: 13s - loss: 1.9180 - regression_loss: 1.6145 - classification_loss: 0.3036 459/500 [==========================>...] - ETA: 13s - loss: 1.9188 - regression_loss: 1.6151 - classification_loss: 0.3036 460/500 [==========================>...] - ETA: 13s - loss: 1.9192 - regression_loss: 1.6155 - classification_loss: 0.3037 461/500 [==========================>...] - ETA: 12s - loss: 1.9179 - regression_loss: 1.6144 - classification_loss: 0.3035 462/500 [==========================>...] 
- ETA: 12s - loss: 1.9173 - regression_loss: 1.6138 - classification_loss: 0.3034 463/500 [==========================>...] - ETA: 12s - loss: 1.9186 - regression_loss: 1.6150 - classification_loss: 0.3036 464/500 [==========================>...] - ETA: 11s - loss: 1.9168 - regression_loss: 1.6135 - classification_loss: 0.3033 465/500 [==========================>...] - ETA: 11s - loss: 1.9177 - regression_loss: 1.6140 - classification_loss: 0.3036 466/500 [==========================>...] - ETA: 11s - loss: 1.9179 - regression_loss: 1.6143 - classification_loss: 0.3036 467/500 [===========================>..] - ETA: 10s - loss: 1.9154 - regression_loss: 1.6121 - classification_loss: 0.3033 468/500 [===========================>..] - ETA: 10s - loss: 1.9157 - regression_loss: 1.6124 - classification_loss: 0.3033 469/500 [===========================>..] - ETA: 10s - loss: 1.9163 - regression_loss: 1.6129 - classification_loss: 0.3033 470/500 [===========================>..] - ETA: 9s - loss: 1.9170 - regression_loss: 1.6136 - classification_loss: 0.3034  471/500 [===========================>..] - ETA: 9s - loss: 1.9181 - regression_loss: 1.6145 - classification_loss: 0.3035 472/500 [===========================>..] - ETA: 9s - loss: 1.9178 - regression_loss: 1.6144 - classification_loss: 0.3035 473/500 [===========================>..] - ETA: 8s - loss: 1.9177 - regression_loss: 1.6142 - classification_loss: 0.3034 474/500 [===========================>..] - ETA: 8s - loss: 1.9181 - regression_loss: 1.6147 - classification_loss: 0.3034 475/500 [===========================>..] - ETA: 8s - loss: 1.9194 - regression_loss: 1.6158 - classification_loss: 0.3036 476/500 [===========================>..] - ETA: 7s - loss: 1.9200 - regression_loss: 1.6166 - classification_loss: 0.3034 477/500 [===========================>..] - ETA: 7s - loss: 1.9200 - regression_loss: 1.6164 - classification_loss: 0.3036 478/500 [===========================>..] 
- ETA: 7s - loss: 1.9209 - regression_loss: 1.6173 - classification_loss: 0.3036 479/500 [===========================>..] - ETA: 6s - loss: 1.9209 - regression_loss: 1.6173 - classification_loss: 0.3036 480/500 [===========================>..] - ETA: 6s - loss: 1.9204 - regression_loss: 1.6169 - classification_loss: 0.3035 481/500 [===========================>..] - ETA: 6s - loss: 1.9200 - regression_loss: 1.6167 - classification_loss: 0.3033 482/500 [===========================>..] - ETA: 5s - loss: 1.9199 - regression_loss: 1.6167 - classification_loss: 0.3032 483/500 [===========================>..] - ETA: 5s - loss: 1.9193 - regression_loss: 1.6163 - classification_loss: 0.3031 484/500 [============================>.] - ETA: 5s - loss: 1.9184 - regression_loss: 1.6156 - classification_loss: 0.3029 485/500 [============================>.] - ETA: 4s - loss: 1.9186 - regression_loss: 1.6158 - classification_loss: 0.3029 486/500 [============================>.] - ETA: 4s - loss: 1.9189 - regression_loss: 1.6159 - classification_loss: 0.3029 487/500 [============================>.] - ETA: 4s - loss: 1.9171 - regression_loss: 1.6145 - classification_loss: 0.3026 488/500 [============================>.] - ETA: 3s - loss: 1.9178 - regression_loss: 1.6149 - classification_loss: 0.3028 489/500 [============================>.] - ETA: 3s - loss: 1.9172 - regression_loss: 1.6145 - classification_loss: 0.3027 490/500 [============================>.] - ETA: 3s - loss: 1.9166 - regression_loss: 1.6140 - classification_loss: 0.3026 491/500 [============================>.] - ETA: 2s - loss: 1.9161 - regression_loss: 1.6135 - classification_loss: 0.3026 492/500 [============================>.] - ETA: 2s - loss: 1.9165 - regression_loss: 1.6138 - classification_loss: 0.3026 493/500 [============================>.] - ETA: 2s - loss: 1.9157 - regression_loss: 1.6132 - classification_loss: 0.3025 494/500 [============================>.] 
- ETA: 1s - loss: 1.9157 - regression_loss: 1.6132 - classification_loss: 0.3025 495/500 [============================>.] - ETA: 1s - loss: 1.9151 - regression_loss: 1.6128 - classification_loss: 0.3023 496/500 [============================>.] - ETA: 1s - loss: 1.9166 - regression_loss: 1.6140 - classification_loss: 0.3025 497/500 [============================>.] - ETA: 0s - loss: 1.9167 - regression_loss: 1.6142 - classification_loss: 0.3025 498/500 [============================>.] - ETA: 0s - loss: 1.9161 - regression_loss: 1.6137 - classification_loss: 0.3024 499/500 [============================>.] - ETA: 0s - loss: 1.9158 - regression_loss: 1.6135 - classification_loss: 0.3023 500/500 [==============================] - 165s 331ms/step - loss: 1.9153 - regression_loss: 1.6130 - classification_loss: 0.3023 1172 instances of class plum with average precision: 0.4947 mAP: 0.4947 Epoch 00010: saving model to ./training/snapshots/resnet101_pascal_10.h5 Epoch 11/150 1/500 [..............................] - ETA: 2:36 - loss: 2.1900 - regression_loss: 1.8500 - classification_loss: 0.3400 2/500 [..............................] - ETA: 2:36 - loss: 2.3457 - regression_loss: 1.9019 - classification_loss: 0.4438 3/500 [..............................] - ETA: 2:34 - loss: 2.2725 - regression_loss: 1.8585 - classification_loss: 0.4140 4/500 [..............................] - ETA: 2:37 - loss: 2.1573 - regression_loss: 1.7805 - classification_loss: 0.3768 5/500 [..............................] - ETA: 2:39 - loss: 2.0312 - regression_loss: 1.6871 - classification_loss: 0.3441 6/500 [..............................] - ETA: 2:41 - loss: 2.0137 - regression_loss: 1.6834 - classification_loss: 0.3303 7/500 [..............................] - ETA: 2:42 - loss: 2.0192 - regression_loss: 1.6882 - classification_loss: 0.3310 8/500 [..............................] - ETA: 2:40 - loss: 2.0067 - regression_loss: 1.6816 - classification_loss: 0.3251 9/500 [..............................] 
- ETA: 2:39 - loss: 1.9565 - regression_loss: 1.6426 - classification_loss: 0.3139 10/500 [..............................] - ETA: 2:39 - loss: 1.8809 - regression_loss: 1.5823 - classification_loss: 0.2985 11/500 [..............................] - ETA: 2:39 - loss: 1.8013 - regression_loss: 1.5148 - classification_loss: 0.2865 12/500 [..............................] - ETA: 2:38 - loss: 1.8350 - regression_loss: 1.5417 - classification_loss: 0.2933 13/500 [..............................] - ETA: 2:37 - loss: 1.7684 - regression_loss: 1.4811 - classification_loss: 0.2873 14/500 [..............................] - ETA: 2:37 - loss: 1.7190 - regression_loss: 1.4393 - classification_loss: 0.2797 15/500 [..............................] - ETA: 2:37 - loss: 1.7494 - regression_loss: 1.4660 - classification_loss: 0.2834 16/500 [..............................] - ETA: 2:36 - loss: 1.7521 - regression_loss: 1.4677 - classification_loss: 0.2844 17/500 [>.............................] - ETA: 2:36 - loss: 1.7582 - regression_loss: 1.4716 - classification_loss: 0.2866 18/500 [>.............................] - ETA: 2:35 - loss: 1.7525 - regression_loss: 1.4728 - classification_loss: 0.2797 19/500 [>.............................] - ETA: 2:35 - loss: 1.7633 - regression_loss: 1.4834 - classification_loss: 0.2799 20/500 [>.............................] - ETA: 2:34 - loss: 1.7844 - regression_loss: 1.5031 - classification_loss: 0.2813 21/500 [>.............................] - ETA: 2:34 - loss: 1.7469 - regression_loss: 1.4730 - classification_loss: 0.2739 22/500 [>.............................] - ETA: 2:34 - loss: 1.7404 - regression_loss: 1.4686 - classification_loss: 0.2718 23/500 [>.............................] - ETA: 2:33 - loss: 1.7521 - regression_loss: 1.4781 - classification_loss: 0.2740 24/500 [>.............................] - ETA: 2:34 - loss: 1.7295 - regression_loss: 1.4591 - classification_loss: 0.2704 25/500 [>.............................] 
- ETA: 2:34 - loss: 1.7447 - regression_loss: 1.4728 - classification_loss: 0.2719 26/500 [>.............................] - ETA: 2:33 - loss: 1.7517 - regression_loss: 1.4788 - classification_loss: 0.2729 27/500 [>.............................] - ETA: 2:33 - loss: 1.7664 - regression_loss: 1.4921 - classification_loss: 0.2742 28/500 [>.............................] - ETA: 2:33 - loss: 1.7759 - regression_loss: 1.5013 - classification_loss: 0.2746 29/500 [>.............................] - ETA: 2:33 - loss: 1.7871 - regression_loss: 1.5149 - classification_loss: 0.2722 30/500 [>.............................] - ETA: 2:32 - loss: 1.8064 - regression_loss: 1.5306 - classification_loss: 0.2758 31/500 [>.............................] - ETA: 2:32 - loss: 1.7984 - regression_loss: 1.5235 - classification_loss: 0.2749 32/500 [>.............................] - ETA: 2:32 - loss: 1.7751 - regression_loss: 1.4995 - classification_loss: 0.2755 33/500 [>.............................] - ETA: 2:31 - loss: 1.7607 - regression_loss: 1.4863 - classification_loss: 0.2744 34/500 [=>............................] - ETA: 2:31 - loss: 1.7719 - regression_loss: 1.4988 - classification_loss: 0.2732 35/500 [=>............................] - ETA: 2:31 - loss: 1.7587 - regression_loss: 1.4865 - classification_loss: 0.2722 36/500 [=>............................] - ETA: 2:31 - loss: 1.7708 - regression_loss: 1.4960 - classification_loss: 0.2747 37/500 [=>............................] - ETA: 2:31 - loss: 1.7825 - regression_loss: 1.5062 - classification_loss: 0.2763 38/500 [=>............................] - ETA: 2:31 - loss: 1.8060 - regression_loss: 1.5238 - classification_loss: 0.2822 39/500 [=>............................] - ETA: 2:30 - loss: 1.8163 - regression_loss: 1.5319 - classification_loss: 0.2844 40/500 [=>............................] - ETA: 2:30 - loss: 1.8121 - regression_loss: 1.5294 - classification_loss: 0.2827 41/500 [=>............................] 
- ETA: 2:30 - loss: 1.8259 - regression_loss: 1.5422 - classification_loss: 0.2837 42/500 [=>............................] - ETA: 2:30 - loss: 1.8269 - regression_loss: 1.5424 - classification_loss: 0.2845 43/500 [=>............................] - ETA: 2:29 - loss: 1.8405 - regression_loss: 1.5521 - classification_loss: 0.2884 44/500 [=>............................] - ETA: 2:29 - loss: 1.8217 - regression_loss: 1.5360 - classification_loss: 0.2856 45/500 [=>............................] - ETA: 2:28 - loss: 1.8286 - regression_loss: 1.5423 - classification_loss: 0.2862 46/500 [=>............................] - ETA: 2:28 - loss: 1.8337 - regression_loss: 1.5473 - classification_loss: 0.2864 47/500 [=>............................] - ETA: 2:28 - loss: 1.8293 - regression_loss: 1.5427 - classification_loss: 0.2866 48/500 [=>............................] - ETA: 2:27 - loss: 1.8378 - regression_loss: 1.5501 - classification_loss: 0.2877 49/500 [=>............................] - ETA: 2:27 - loss: 1.8409 - regression_loss: 1.5530 - classification_loss: 0.2879 50/500 [==>...........................] - ETA: 2:27 - loss: 1.8464 - regression_loss: 1.5572 - classification_loss: 0.2892 51/500 [==>...........................] - ETA: 2:26 - loss: 1.8454 - regression_loss: 1.5548 - classification_loss: 0.2905 52/500 [==>...........................] - ETA: 2:26 - loss: 1.8491 - regression_loss: 1.5574 - classification_loss: 0.2917 53/500 [==>...........................] - ETA: 2:26 - loss: 1.8523 - regression_loss: 1.5598 - classification_loss: 0.2925 54/500 [==>...........................] - ETA: 2:25 - loss: 1.8549 - regression_loss: 1.5621 - classification_loss: 0.2928 55/500 [==>...........................] - ETA: 2:25 - loss: 1.8517 - regression_loss: 1.5561 - classification_loss: 0.2956 56/500 [==>...........................] - ETA: 2:25 - loss: 1.8458 - regression_loss: 1.5520 - classification_loss: 0.2938 57/500 [==>...........................] 
- ETA: 2:24 - loss: 1.8485 - regression_loss: 1.5542 - classification_loss: 0.2942 58/500 [==>...........................] - ETA: 2:24 - loss: 1.8564 - regression_loss: 1.5616 - classification_loss: 0.2949 59/500 [==>...........................] - ETA: 2:24 - loss: 1.8602 - regression_loss: 1.5651 - classification_loss: 0.2951 60/500 [==>...........................] - ETA: 2:23 - loss: 1.8562 - regression_loss: 1.5627 - classification_loss: 0.2935 61/500 [==>...........................] - ETA: 2:23 - loss: 1.8701 - regression_loss: 1.5746 - classification_loss: 0.2955 62/500 [==>...........................] - ETA: 2:23 - loss: 1.8641 - regression_loss: 1.5690 - classification_loss: 0.2952 63/500 [==>...........................] - ETA: 2:22 - loss: 1.8656 - regression_loss: 1.5697 - classification_loss: 0.2959 64/500 [==>...........................] - ETA: 2:22 - loss: 1.8688 - regression_loss: 1.5733 - classification_loss: 0.2956 65/500 [==>...........................] - ETA: 2:22 - loss: 1.8673 - regression_loss: 1.5713 - classification_loss: 0.2959 66/500 [==>...........................] - ETA: 2:22 - loss: 1.8727 - regression_loss: 1.5745 - classification_loss: 0.2982 67/500 [===>..........................] - ETA: 2:21 - loss: 1.8768 - regression_loss: 1.5784 - classification_loss: 0.2984 68/500 [===>..........................] - ETA: 2:21 - loss: 1.8770 - regression_loss: 1.5795 - classification_loss: 0.2975 69/500 [===>..........................] - ETA: 2:21 - loss: 1.8831 - regression_loss: 1.5842 - classification_loss: 0.2989 70/500 [===>..........................] - ETA: 2:20 - loss: 1.8831 - regression_loss: 1.5848 - classification_loss: 0.2983 71/500 [===>..........................] - ETA: 2:20 - loss: 1.8936 - regression_loss: 1.5937 - classification_loss: 0.2999 72/500 [===>..........................] - ETA: 2:20 - loss: 1.8919 - regression_loss: 1.5926 - classification_loss: 0.2993 73/500 [===>..........................] 
- ETA: 2:19 - loss: 1.8923 - regression_loss: 1.5926 - classification_loss: 0.2997 74/500 [===>..........................] - ETA: 2:19 - loss: 1.8883 - regression_loss: 1.5872 - classification_loss: 0.3010 75/500 [===>..........................] - ETA: 2:19 - loss: 1.8922 - regression_loss: 1.5907 - classification_loss: 0.3015 76/500 [===>..........................] - ETA: 2:18 - loss: 1.8978 - regression_loss: 1.5949 - classification_loss: 0.3028 77/500 [===>..........................] - ETA: 2:18 - loss: 1.9010 - regression_loss: 1.5975 - classification_loss: 0.3035 78/500 [===>..........................] - ETA: 2:18 - loss: 1.8880 - regression_loss: 1.5864 - classification_loss: 0.3016 79/500 [===>..........................] - ETA: 2:18 - loss: 1.8769 - regression_loss: 1.5768 - classification_loss: 0.3002 80/500 [===>..........................] - ETA: 2:17 - loss: 1.8722 - regression_loss: 1.5724 - classification_loss: 0.2998 81/500 [===>..........................] - ETA: 2:17 - loss: 1.8740 - regression_loss: 1.5743 - classification_loss: 0.2997 82/500 [===>..........................] - ETA: 2:17 - loss: 1.8654 - regression_loss: 1.5675 - classification_loss: 0.2980 83/500 [===>..........................] - ETA: 2:16 - loss: 1.8695 - regression_loss: 1.5705 - classification_loss: 0.2989 84/500 [====>.........................] - ETA: 2:16 - loss: 1.8725 - regression_loss: 1.5733 - classification_loss: 0.2992 85/500 [====>.........................] - ETA: 2:16 - loss: 1.8719 - regression_loss: 1.5732 - classification_loss: 0.2987 86/500 [====>.........................] - ETA: 2:15 - loss: 1.8748 - regression_loss: 1.5755 - classification_loss: 0.2993 87/500 [====>.........................] - ETA: 2:15 - loss: 1.8653 - regression_loss: 1.5676 - classification_loss: 0.2977 88/500 [====>.........................] - ETA: 2:14 - loss: 1.8652 - regression_loss: 1.5677 - classification_loss: 0.2975 89/500 [====>.........................] 
- ETA: 2:14 - loss: 1.8556 - regression_loss: 1.5598 - classification_loss: 0.2958
 90/500 [====>.........................] - ETA: 2:14 - loss: 1.8471 - regression_loss: 1.5528 - classification_loss: 0.2943
100/500 [=====>........................] - ETA: 2:10 - loss: 1.8405 - regression_loss: 1.5454 - classification_loss: 0.2951
150/500 [========>.....................] - ETA: 1:54 - loss: 1.8546 - regression_loss: 1.5629 - classification_loss: 0.2917
200/500 [===========>..................] - ETA: 1:38 - loss: 1.8466 - regression_loss: 1.5573 - classification_loss: 0.2893
250/500 [==============>...............] - ETA: 1:22 - loss: 1.8337 - regression_loss: 1.5465 - classification_loss: 0.2872
300/500 [=================>............] - ETA: 1:05 - loss: 1.8389 - regression_loss: 1.5510 - classification_loss: 0.2879
350/500 [====================>.........] - ETA: 49s - loss: 1.8305 - regression_loss: 1.5438 - classification_loss: 0.2867
400/500 [=======================>......] - ETA: 32s - loss: 1.8288 - regression_loss: 1.5427 - classification_loss: 0.2862
425/500 [========================>.....]
- ETA: 24s - loss: 1.8280 - regression_loss: 1.5421 - classification_loss: 0.2860 426/500 [========================>.....] - ETA: 24s - loss: 1.8280 - regression_loss: 1.5421 - classification_loss: 0.2859 427/500 [========================>.....] - ETA: 24s - loss: 1.8283 - regression_loss: 1.5424 - classification_loss: 0.2860 428/500 [========================>.....] - ETA: 23s - loss: 1.8274 - regression_loss: 1.5416 - classification_loss: 0.2858 429/500 [========================>.....] - ETA: 23s - loss: 1.8263 - regression_loss: 1.5408 - classification_loss: 0.2855 430/500 [========================>.....] - ETA: 23s - loss: 1.8247 - regression_loss: 1.5395 - classification_loss: 0.2852 431/500 [========================>.....] - ETA: 22s - loss: 1.8262 - regression_loss: 1.5408 - classification_loss: 0.2855 432/500 [========================>.....] - ETA: 22s - loss: 1.8269 - regression_loss: 1.5413 - classification_loss: 0.2856 433/500 [========================>.....] - ETA: 22s - loss: 1.8275 - regression_loss: 1.5420 - classification_loss: 0.2856 434/500 [=========================>....] - ETA: 21s - loss: 1.8282 - regression_loss: 1.5425 - classification_loss: 0.2857 435/500 [=========================>....] - ETA: 21s - loss: 1.8263 - regression_loss: 1.5407 - classification_loss: 0.2856 436/500 [=========================>....] - ETA: 21s - loss: 1.8243 - regression_loss: 1.5390 - classification_loss: 0.2853 437/500 [=========================>....] - ETA: 20s - loss: 1.8255 - regression_loss: 1.5401 - classification_loss: 0.2854 438/500 [=========================>....] - ETA: 20s - loss: 1.8249 - regression_loss: 1.5395 - classification_loss: 0.2854 439/500 [=========================>....] - ETA: 20s - loss: 1.8238 - regression_loss: 1.5385 - classification_loss: 0.2853 440/500 [=========================>....] - ETA: 19s - loss: 1.8237 - regression_loss: 1.5384 - classification_loss: 0.2853 441/500 [=========================>....] 
- ETA: 19s - loss: 1.8221 - regression_loss: 1.5369 - classification_loss: 0.2851 442/500 [=========================>....] - ETA: 19s - loss: 1.8223 - regression_loss: 1.5372 - classification_loss: 0.2851 443/500 [=========================>....] - ETA: 18s - loss: 1.8218 - regression_loss: 1.5369 - classification_loss: 0.2848 444/500 [=========================>....] - ETA: 18s - loss: 1.8218 - regression_loss: 1.5370 - classification_loss: 0.2848 445/500 [=========================>....] - ETA: 18s - loss: 1.8222 - regression_loss: 1.5373 - classification_loss: 0.2849 446/500 [=========================>....] - ETA: 17s - loss: 1.8209 - regression_loss: 1.5363 - classification_loss: 0.2846 447/500 [=========================>....] - ETA: 17s - loss: 1.8225 - regression_loss: 1.5377 - classification_loss: 0.2848 448/500 [=========================>....] - ETA: 17s - loss: 1.8229 - regression_loss: 1.5380 - classification_loss: 0.2849 449/500 [=========================>....] - ETA: 16s - loss: 1.8240 - regression_loss: 1.5390 - classification_loss: 0.2850 450/500 [==========================>...] - ETA: 16s - loss: 1.8245 - regression_loss: 1.5395 - classification_loss: 0.2850 451/500 [==========================>...] - ETA: 16s - loss: 1.8239 - regression_loss: 1.5391 - classification_loss: 0.2848 452/500 [==========================>...] - ETA: 15s - loss: 1.8227 - regression_loss: 1.5378 - classification_loss: 0.2849 453/500 [==========================>...] - ETA: 15s - loss: 1.8231 - regression_loss: 1.5383 - classification_loss: 0.2849 454/500 [==========================>...] - ETA: 15s - loss: 1.8237 - regression_loss: 1.5387 - classification_loss: 0.2850 455/500 [==========================>...] - ETA: 14s - loss: 1.8240 - regression_loss: 1.5390 - classification_loss: 0.2850 456/500 [==========================>...] - ETA: 14s - loss: 1.8247 - regression_loss: 1.5396 - classification_loss: 0.2851 457/500 [==========================>...] 
- ETA: 14s - loss: 1.8245 - regression_loss: 1.5394 - classification_loss: 0.2850 458/500 [==========================>...] - ETA: 13s - loss: 1.8231 - regression_loss: 1.5383 - classification_loss: 0.2848 459/500 [==========================>...] - ETA: 13s - loss: 1.8237 - regression_loss: 1.5387 - classification_loss: 0.2850 460/500 [==========================>...] - ETA: 13s - loss: 1.8217 - regression_loss: 1.5370 - classification_loss: 0.2846 461/500 [==========================>...] - ETA: 12s - loss: 1.8219 - regression_loss: 1.5373 - classification_loss: 0.2846 462/500 [==========================>...] - ETA: 12s - loss: 1.8206 - regression_loss: 1.5363 - classification_loss: 0.2844 463/500 [==========================>...] - ETA: 12s - loss: 1.8197 - regression_loss: 1.5351 - classification_loss: 0.2846 464/500 [==========================>...] - ETA: 11s - loss: 1.8172 - regression_loss: 1.5330 - classification_loss: 0.2842 465/500 [==========================>...] - ETA: 11s - loss: 1.8176 - regression_loss: 1.5332 - classification_loss: 0.2844 466/500 [==========================>...] - ETA: 11s - loss: 1.8182 - regression_loss: 1.5337 - classification_loss: 0.2845 467/500 [===========================>..] - ETA: 10s - loss: 1.8180 - regression_loss: 1.5336 - classification_loss: 0.2844 468/500 [===========================>..] - ETA: 10s - loss: 1.8179 - regression_loss: 1.5335 - classification_loss: 0.2843 469/500 [===========================>..] - ETA: 10s - loss: 1.8171 - regression_loss: 1.5328 - classification_loss: 0.2843 470/500 [===========================>..] - ETA: 9s - loss: 1.8175 - regression_loss: 1.5331 - classification_loss: 0.2844  471/500 [===========================>..] - ETA: 9s - loss: 1.8151 - regression_loss: 1.5310 - classification_loss: 0.2841 472/500 [===========================>..] - ETA: 9s - loss: 1.8153 - regression_loss: 1.5312 - classification_loss: 0.2842 473/500 [===========================>..] 
- ETA: 8s - loss: 1.8148 - regression_loss: 1.5308 - classification_loss: 0.2841 474/500 [===========================>..] - ETA: 8s - loss: 1.8149 - regression_loss: 1.5309 - classification_loss: 0.2840 475/500 [===========================>..] - ETA: 8s - loss: 1.8151 - regression_loss: 1.5313 - classification_loss: 0.2838 476/500 [===========================>..] - ETA: 7s - loss: 1.8155 - regression_loss: 1.5316 - classification_loss: 0.2839 477/500 [===========================>..] - ETA: 7s - loss: 1.8166 - regression_loss: 1.5326 - classification_loss: 0.2840 478/500 [===========================>..] - ETA: 7s - loss: 1.8161 - regression_loss: 1.5322 - classification_loss: 0.2839 479/500 [===========================>..] - ETA: 6s - loss: 1.8165 - regression_loss: 1.5326 - classification_loss: 0.2839 480/500 [===========================>..] - ETA: 6s - loss: 1.8170 - regression_loss: 1.5332 - classification_loss: 0.2838 481/500 [===========================>..] - ETA: 6s - loss: 1.8173 - regression_loss: 1.5335 - classification_loss: 0.2838 482/500 [===========================>..] - ETA: 5s - loss: 1.8159 - regression_loss: 1.5320 - classification_loss: 0.2839 483/500 [===========================>..] - ETA: 5s - loss: 1.8139 - regression_loss: 1.5302 - classification_loss: 0.2836 484/500 [============================>.] - ETA: 5s - loss: 1.8138 - regression_loss: 1.5302 - classification_loss: 0.2836 485/500 [============================>.] - ETA: 4s - loss: 1.8145 - regression_loss: 1.5309 - classification_loss: 0.2836 486/500 [============================>.] - ETA: 4s - loss: 1.8155 - regression_loss: 1.5317 - classification_loss: 0.2838 487/500 [============================>.] - ETA: 4s - loss: 1.8155 - regression_loss: 1.5317 - classification_loss: 0.2838 488/500 [============================>.] - ETA: 3s - loss: 1.8159 - regression_loss: 1.5322 - classification_loss: 0.2837 489/500 [============================>.] 
- ETA: 3s - loss: 1.8172 - regression_loss: 1.5332 - classification_loss: 0.2840 490/500 [============================>.] - ETA: 3s - loss: 1.8168 - regression_loss: 1.5330 - classification_loss: 0.2839 491/500 [============================>.] - ETA: 2s - loss: 1.8172 - regression_loss: 1.5333 - classification_loss: 0.2839 492/500 [============================>.] - ETA: 2s - loss: 1.8178 - regression_loss: 1.5339 - classification_loss: 0.2839 493/500 [============================>.] - ETA: 2s - loss: 1.8180 - regression_loss: 1.5340 - classification_loss: 0.2840 494/500 [============================>.] - ETA: 1s - loss: 1.8188 - regression_loss: 1.5348 - classification_loss: 0.2841 495/500 [============================>.] - ETA: 1s - loss: 1.8175 - regression_loss: 1.5336 - classification_loss: 0.2839 496/500 [============================>.] - ETA: 1s - loss: 1.8163 - regression_loss: 1.5326 - classification_loss: 0.2837 497/500 [============================>.] - ETA: 0s - loss: 1.8159 - regression_loss: 1.5322 - classification_loss: 0.2837 498/500 [============================>.] - ETA: 0s - loss: 1.8163 - regression_loss: 1.5325 - classification_loss: 0.2838 499/500 [============================>.] - ETA: 0s - loss: 1.8148 - regression_loss: 1.5312 - classification_loss: 0.2836 500/500 [==============================] - 165s 330ms/step - loss: 1.8133 - regression_loss: 1.5300 - classification_loss: 0.2833 1172 instances of class plum with average precision: 0.5111 mAP: 0.5111 Epoch 00011: saving model to ./training/snapshots/resnet101_pascal_11.h5 Epoch 12/150 1/500 [..............................] - ETA: 2:48 - loss: 2.5593 - regression_loss: 2.1319 - classification_loss: 0.4275 2/500 [..............................] - ETA: 2:45 - loss: 2.1125 - regression_loss: 1.7781 - classification_loss: 0.3343 3/500 [..............................] - ETA: 2:45 - loss: 1.8252 - regression_loss: 1.5418 - classification_loss: 0.2834 4/500 [..............................] 
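The epoch summary lines above report three metrics per step: `loss`, `regression_loss`, and `classification_loss`. In keras-retinanet the total loss is the sum of the box-regression (smooth L1) and classification (focal) terms, so the logged total should equal the sum of the other two up to print rounding. A minimal sketch of pulling those values out of a log line and checking that decomposition (the regex and variable names are my own, not part of the training script):

```python
import re

# One end-of-epoch summary line, copied from the log above.
line = ("500/500 [==============================] - 165s 330ms/step - "
        "loss: 1.8133 - regression_loss: 1.5300 - classification_loss: 0.2833")

# Pull every "name: value" metric pair into a dict of floats.
metrics = {k: float(v) for k, v in re.findall(r"(\w+): ([0-9.]+)", line)}

# Total loss should be the sum of the regression and classification terms,
# up to the 4-decimal rounding in the printout.
total = metrics["regression_loss"] + metrics["classification_loss"]
assert abs(metrics["loss"] - total) < 1e-3
```

The same parse applied across lines gives the per-step loss trajectory, which is often more useful than eyeballing the raw progress frames.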
- ETA: 2:45 - loss: 1.9111 - regression_loss: 1.6109 - classification_loss: 0.3002
[per-step progress-bar redraw frames for steps 5-195 elided; at step 195/500: loss: 1.8258 - regression_loss: 1.5430 - classification_loss: 0.2827]
196/500 [==========>...................]
- ETA: 1:40 - loss: 1.8225 - regression_loss: 1.5404 - classification_loss: 0.2821 197/500 [==========>...................] - ETA: 1:40 - loss: 1.8202 - regression_loss: 1.5385 - classification_loss: 0.2817 198/500 [==========>...................] - ETA: 1:39 - loss: 1.8177 - regression_loss: 1.5366 - classification_loss: 0.2811 199/500 [==========>...................] - ETA: 1:39 - loss: 1.8189 - regression_loss: 1.5375 - classification_loss: 0.2813 200/500 [===========>..................] - ETA: 1:39 - loss: 1.8181 - regression_loss: 1.5369 - classification_loss: 0.2812 201/500 [===========>..................] - ETA: 1:38 - loss: 1.8205 - regression_loss: 1.5386 - classification_loss: 0.2819 202/500 [===========>..................] - ETA: 1:38 - loss: 1.8200 - regression_loss: 1.5382 - classification_loss: 0.2819 203/500 [===========>..................] - ETA: 1:38 - loss: 1.8204 - regression_loss: 1.5384 - classification_loss: 0.2820 204/500 [===========>..................] - ETA: 1:38 - loss: 1.8198 - regression_loss: 1.5378 - classification_loss: 0.2820 205/500 [===========>..................] - ETA: 1:37 - loss: 1.8212 - regression_loss: 1.5390 - classification_loss: 0.2823 206/500 [===========>..................] - ETA: 1:37 - loss: 1.8198 - regression_loss: 1.5381 - classification_loss: 0.2817 207/500 [===========>..................] - ETA: 1:36 - loss: 1.8199 - regression_loss: 1.5382 - classification_loss: 0.2817 208/500 [===========>..................] - ETA: 1:36 - loss: 1.8220 - regression_loss: 1.5402 - classification_loss: 0.2818 209/500 [===========>..................] - ETA: 1:36 - loss: 1.8180 - regression_loss: 1.5368 - classification_loss: 0.2812 210/500 [===========>..................] - ETA: 1:35 - loss: 1.8171 - regression_loss: 1.5361 - classification_loss: 0.2811 211/500 [===========>..................] - ETA: 1:35 - loss: 1.8157 - regression_loss: 1.5344 - classification_loss: 0.2813 212/500 [===========>..................] 
- ETA: 1:35 - loss: 1.8157 - regression_loss: 1.5344 - classification_loss: 0.2812 213/500 [===========>..................] - ETA: 1:34 - loss: 1.8152 - regression_loss: 1.5340 - classification_loss: 0.2812 214/500 [===========>..................] - ETA: 1:34 - loss: 1.8107 - regression_loss: 1.5299 - classification_loss: 0.2809 215/500 [===========>..................] - ETA: 1:34 - loss: 1.8107 - regression_loss: 1.5300 - classification_loss: 0.2807 216/500 [===========>..................] - ETA: 1:33 - loss: 1.8113 - regression_loss: 1.5304 - classification_loss: 0.2808 217/500 [============>.................] - ETA: 1:33 - loss: 1.8107 - regression_loss: 1.5300 - classification_loss: 0.2808 218/500 [============>.................] - ETA: 1:33 - loss: 1.8106 - regression_loss: 1.5297 - classification_loss: 0.2809 219/500 [============>.................] - ETA: 1:32 - loss: 1.8112 - regression_loss: 1.5301 - classification_loss: 0.2811 220/500 [============>.................] - ETA: 1:32 - loss: 1.8124 - regression_loss: 1.5310 - classification_loss: 0.2814 221/500 [============>.................] - ETA: 1:32 - loss: 1.8120 - regression_loss: 1.5307 - classification_loss: 0.2814 222/500 [============>.................] - ETA: 1:31 - loss: 1.8127 - regression_loss: 1.5313 - classification_loss: 0.2814 223/500 [============>.................] - ETA: 1:31 - loss: 1.8148 - regression_loss: 1.5330 - classification_loss: 0.2818 224/500 [============>.................] - ETA: 1:31 - loss: 1.8157 - regression_loss: 1.5339 - classification_loss: 0.2818 225/500 [============>.................] - ETA: 1:30 - loss: 1.8146 - regression_loss: 1.5329 - classification_loss: 0.2817 226/500 [============>.................] - ETA: 1:30 - loss: 1.8158 - regression_loss: 1.5341 - classification_loss: 0.2817 227/500 [============>.................] - ETA: 1:30 - loss: 1.8168 - regression_loss: 1.5348 - classification_loss: 0.2820 228/500 [============>.................] 
- ETA: 1:30 - loss: 1.8158 - regression_loss: 1.5341 - classification_loss: 0.2817 229/500 [============>.................] - ETA: 1:29 - loss: 1.8133 - regression_loss: 1.5318 - classification_loss: 0.2815 230/500 [============>.................] - ETA: 1:29 - loss: 1.8125 - regression_loss: 1.5311 - classification_loss: 0.2813 231/500 [============>.................] - ETA: 1:29 - loss: 1.8131 - regression_loss: 1.5320 - classification_loss: 0.2812 232/500 [============>.................] - ETA: 1:28 - loss: 1.8138 - regression_loss: 1.5325 - classification_loss: 0.2812 233/500 [============>.................] - ETA: 1:28 - loss: 1.8124 - regression_loss: 1.5312 - classification_loss: 0.2813 234/500 [=============>................] - ETA: 1:28 - loss: 1.8140 - regression_loss: 1.5324 - classification_loss: 0.2816 235/500 [=============>................] - ETA: 1:27 - loss: 1.8165 - regression_loss: 1.5346 - classification_loss: 0.2818 236/500 [=============>................] - ETA: 1:27 - loss: 1.8168 - regression_loss: 1.5349 - classification_loss: 0.2819 237/500 [=============>................] - ETA: 1:27 - loss: 1.8180 - regression_loss: 1.5361 - classification_loss: 0.2819 238/500 [=============>................] - ETA: 1:26 - loss: 1.8176 - regression_loss: 1.5361 - classification_loss: 0.2816 239/500 [=============>................] - ETA: 1:26 - loss: 1.8178 - regression_loss: 1.5362 - classification_loss: 0.2816 240/500 [=============>................] - ETA: 1:26 - loss: 1.8165 - regression_loss: 1.5353 - classification_loss: 0.2811 241/500 [=============>................] - ETA: 1:25 - loss: 1.8147 - regression_loss: 1.5339 - classification_loss: 0.2808 242/500 [=============>................] - ETA: 1:25 - loss: 1.8179 - regression_loss: 1.5363 - classification_loss: 0.2816 243/500 [=============>................] - ETA: 1:25 - loss: 1.8179 - regression_loss: 1.5367 - classification_loss: 0.2812 244/500 [=============>................] 
- ETA: 1:24 - loss: 1.8187 - regression_loss: 1.5377 - classification_loss: 0.2810 245/500 [=============>................] - ETA: 1:24 - loss: 1.8145 - regression_loss: 1.5342 - classification_loss: 0.2803 246/500 [=============>................] - ETA: 1:24 - loss: 1.8164 - regression_loss: 1.5360 - classification_loss: 0.2804 247/500 [=============>................] - ETA: 1:23 - loss: 1.8137 - regression_loss: 1.5338 - classification_loss: 0.2800 248/500 [=============>................] - ETA: 1:23 - loss: 1.8149 - regression_loss: 1.5348 - classification_loss: 0.2801 249/500 [=============>................] - ETA: 1:23 - loss: 1.8113 - regression_loss: 1.5317 - classification_loss: 0.2796 250/500 [==============>...............] - ETA: 1:22 - loss: 1.8075 - regression_loss: 1.5284 - classification_loss: 0.2791 251/500 [==============>...............] - ETA: 1:22 - loss: 1.8061 - regression_loss: 1.5271 - classification_loss: 0.2790 252/500 [==============>...............] - ETA: 1:22 - loss: 1.8057 - regression_loss: 1.5267 - classification_loss: 0.2790 253/500 [==============>...............] - ETA: 1:21 - loss: 1.8061 - regression_loss: 1.5269 - classification_loss: 0.2791 254/500 [==============>...............] - ETA: 1:21 - loss: 1.8052 - regression_loss: 1.5263 - classification_loss: 0.2789 255/500 [==============>...............] - ETA: 1:20 - loss: 1.8057 - regression_loss: 1.5268 - classification_loss: 0.2789 256/500 [==============>...............] - ETA: 1:20 - loss: 1.8055 - regression_loss: 1.5268 - classification_loss: 0.2787 257/500 [==============>...............] - ETA: 1:20 - loss: 1.8038 - regression_loss: 1.5251 - classification_loss: 0.2787 258/500 [==============>...............] - ETA: 1:20 - loss: 1.7991 - regression_loss: 1.5209 - classification_loss: 0.2782 259/500 [==============>...............] - ETA: 1:19 - loss: 1.8002 - regression_loss: 1.5223 - classification_loss: 0.2779 260/500 [==============>...............] 
- ETA: 1:19 - loss: 1.7997 - regression_loss: 1.5216 - classification_loss: 0.2781 261/500 [==============>...............] - ETA: 1:18 - loss: 1.8009 - regression_loss: 1.5226 - classification_loss: 0.2782 262/500 [==============>...............] - ETA: 1:18 - loss: 1.8018 - regression_loss: 1.5233 - classification_loss: 0.2785 263/500 [==============>...............] - ETA: 1:18 - loss: 1.8029 - regression_loss: 1.5243 - classification_loss: 0.2786 264/500 [==============>...............] - ETA: 1:17 - loss: 1.8026 - regression_loss: 1.5240 - classification_loss: 0.2786 265/500 [==============>...............] - ETA: 1:17 - loss: 1.8025 - regression_loss: 1.5239 - classification_loss: 0.2786 266/500 [==============>...............] - ETA: 1:17 - loss: 1.8023 - regression_loss: 1.5238 - classification_loss: 0.2785 267/500 [===============>..............] - ETA: 1:16 - loss: 1.8017 - regression_loss: 1.5234 - classification_loss: 0.2784 268/500 [===============>..............] - ETA: 1:16 - loss: 1.8011 - regression_loss: 1.5231 - classification_loss: 0.2780 269/500 [===============>..............] - ETA: 1:16 - loss: 1.8013 - regression_loss: 1.5233 - classification_loss: 0.2780 270/500 [===============>..............] - ETA: 1:15 - loss: 1.8013 - regression_loss: 1.5233 - classification_loss: 0.2780 271/500 [===============>..............] - ETA: 1:15 - loss: 1.8015 - regression_loss: 1.5236 - classification_loss: 0.2779 272/500 [===============>..............] - ETA: 1:15 - loss: 1.8029 - regression_loss: 1.5248 - classification_loss: 0.2782 273/500 [===============>..............] - ETA: 1:14 - loss: 1.8052 - regression_loss: 1.5265 - classification_loss: 0.2786 274/500 [===============>..............] - ETA: 1:14 - loss: 1.8038 - regression_loss: 1.5254 - classification_loss: 0.2784 275/500 [===============>..............] - ETA: 1:14 - loss: 1.8061 - regression_loss: 1.5266 - classification_loss: 0.2795 276/500 [===============>..............] 
- ETA: 1:13 - loss: 1.8033 - regression_loss: 1.5242 - classification_loss: 0.2791 277/500 [===============>..............] - ETA: 1:13 - loss: 1.8040 - regression_loss: 1.5249 - classification_loss: 0.2791 278/500 [===============>..............] - ETA: 1:13 - loss: 1.8010 - regression_loss: 1.5219 - classification_loss: 0.2791 279/500 [===============>..............] - ETA: 1:12 - loss: 1.8023 - regression_loss: 1.5233 - classification_loss: 0.2790 280/500 [===============>..............] - ETA: 1:12 - loss: 1.8039 - regression_loss: 1.5246 - classification_loss: 0.2793 281/500 [===============>..............] - ETA: 1:12 - loss: 1.8022 - regression_loss: 1.5233 - classification_loss: 0.2789 282/500 [===============>..............] - ETA: 1:11 - loss: 1.7993 - regression_loss: 1.5207 - classification_loss: 0.2786 283/500 [===============>..............] - ETA: 1:11 - loss: 1.7989 - regression_loss: 1.5205 - classification_loss: 0.2785 284/500 [================>.............] - ETA: 1:11 - loss: 1.7981 - regression_loss: 1.5198 - classification_loss: 0.2783 285/500 [================>.............] - ETA: 1:10 - loss: 1.7958 - regression_loss: 1.5178 - classification_loss: 0.2780 286/500 [================>.............] - ETA: 1:10 - loss: 1.7930 - regression_loss: 1.5153 - classification_loss: 0.2777 287/500 [================>.............] - ETA: 1:10 - loss: 1.7960 - regression_loss: 1.5176 - classification_loss: 0.2784 288/500 [================>.............] - ETA: 1:10 - loss: 1.7962 - regression_loss: 1.5178 - classification_loss: 0.2784 289/500 [================>.............] - ETA: 1:09 - loss: 1.7967 - regression_loss: 1.5182 - classification_loss: 0.2785 290/500 [================>.............] - ETA: 1:09 - loss: 1.7977 - regression_loss: 1.5192 - classification_loss: 0.2785 291/500 [================>.............] - ETA: 1:09 - loss: 1.7980 - regression_loss: 1.5195 - classification_loss: 0.2785 292/500 [================>.............] 
- ETA: 1:08 - loss: 1.7978 - regression_loss: 1.5192 - classification_loss: 0.2785 293/500 [================>.............] - ETA: 1:08 - loss: 1.7980 - regression_loss: 1.5195 - classification_loss: 0.2785 294/500 [================>.............] - ETA: 1:08 - loss: 1.7967 - regression_loss: 1.5186 - classification_loss: 0.2782 295/500 [================>.............] - ETA: 1:07 - loss: 1.7975 - regression_loss: 1.5192 - classification_loss: 0.2783 296/500 [================>.............] - ETA: 1:07 - loss: 1.7994 - regression_loss: 1.5208 - classification_loss: 0.2786 297/500 [================>.............] - ETA: 1:07 - loss: 1.8007 - regression_loss: 1.5221 - classification_loss: 0.2786 298/500 [================>.............] - ETA: 1:06 - loss: 1.8003 - regression_loss: 1.5217 - classification_loss: 0.2786 299/500 [================>.............] - ETA: 1:06 - loss: 1.8009 - regression_loss: 1.5222 - classification_loss: 0.2787 300/500 [=================>............] - ETA: 1:06 - loss: 1.8021 - regression_loss: 1.5233 - classification_loss: 0.2788 301/500 [=================>............] - ETA: 1:05 - loss: 1.8023 - regression_loss: 1.5235 - classification_loss: 0.2788 302/500 [=================>............] - ETA: 1:05 - loss: 1.8029 - regression_loss: 1.5240 - classification_loss: 0.2789 303/500 [=================>............] - ETA: 1:05 - loss: 1.8038 - regression_loss: 1.5245 - classification_loss: 0.2793 304/500 [=================>............] - ETA: 1:04 - loss: 1.8033 - regression_loss: 1.5242 - classification_loss: 0.2791 305/500 [=================>............] - ETA: 1:04 - loss: 1.8038 - regression_loss: 1.5248 - classification_loss: 0.2790 306/500 [=================>............] - ETA: 1:04 - loss: 1.8032 - regression_loss: 1.5244 - classification_loss: 0.2788 307/500 [=================>............] - ETA: 1:03 - loss: 1.8039 - regression_loss: 1.5251 - classification_loss: 0.2789 308/500 [=================>............] 
- ETA: 1:03 - loss: 1.8004 - regression_loss: 1.5218 - classification_loss: 0.2786 309/500 [=================>............] - ETA: 1:03 - loss: 1.7994 - regression_loss: 1.5212 - classification_loss: 0.2782 310/500 [=================>............] - ETA: 1:02 - loss: 1.7996 - regression_loss: 1.5216 - classification_loss: 0.2780 311/500 [=================>............] - ETA: 1:02 - loss: 1.7983 - regression_loss: 1.5205 - classification_loss: 0.2778 312/500 [=================>............] - ETA: 1:02 - loss: 1.7987 - regression_loss: 1.5209 - classification_loss: 0.2778 313/500 [=================>............] - ETA: 1:01 - loss: 1.7966 - regression_loss: 1.5193 - classification_loss: 0.2773 314/500 [=================>............] - ETA: 1:01 - loss: 1.7977 - regression_loss: 1.5203 - classification_loss: 0.2774 315/500 [=================>............] - ETA: 1:01 - loss: 1.7981 - regression_loss: 1.5206 - classification_loss: 0.2775 316/500 [=================>............] - ETA: 1:00 - loss: 1.8006 - regression_loss: 1.5225 - classification_loss: 0.2781 317/500 [==================>...........] - ETA: 1:00 - loss: 1.8007 - regression_loss: 1.5225 - classification_loss: 0.2782 318/500 [==================>...........] - ETA: 1:00 - loss: 1.8003 - regression_loss: 1.5222 - classification_loss: 0.2781 319/500 [==================>...........] - ETA: 59s - loss: 1.8004 - regression_loss: 1.5223 - classification_loss: 0.2780  320/500 [==================>...........] - ETA: 59s - loss: 1.8014 - regression_loss: 1.5232 - classification_loss: 0.2782 321/500 [==================>...........] - ETA: 59s - loss: 1.8014 - regression_loss: 1.5232 - classification_loss: 0.2782 322/500 [==================>...........] - ETA: 58s - loss: 1.8015 - regression_loss: 1.5233 - classification_loss: 0.2782 323/500 [==================>...........] - ETA: 58s - loss: 1.7995 - regression_loss: 1.5216 - classification_loss: 0.2779 324/500 [==================>...........] 
- ETA: 58s - loss: 1.7978 - regression_loss: 1.5200 - classification_loss: 0.2778 325/500 [==================>...........] - ETA: 57s - loss: 1.7973 - regression_loss: 1.5196 - classification_loss: 0.2777 326/500 [==================>...........] - ETA: 57s - loss: 1.7972 - regression_loss: 1.5195 - classification_loss: 0.2777 327/500 [==================>...........] - ETA: 57s - loss: 1.7940 - regression_loss: 1.5168 - classification_loss: 0.2772 328/500 [==================>...........] - ETA: 56s - loss: 1.7942 - regression_loss: 1.5170 - classification_loss: 0.2772 329/500 [==================>...........] - ETA: 56s - loss: 1.7939 - regression_loss: 1.5168 - classification_loss: 0.2771 330/500 [==================>...........] - ETA: 56s - loss: 1.7942 - regression_loss: 1.5169 - classification_loss: 0.2773 331/500 [==================>...........] - ETA: 55s - loss: 1.7933 - regression_loss: 1.5160 - classification_loss: 0.2773 332/500 [==================>...........] - ETA: 55s - loss: 1.7934 - regression_loss: 1.5159 - classification_loss: 0.2774 333/500 [==================>...........] - ETA: 55s - loss: 1.7928 - regression_loss: 1.5154 - classification_loss: 0.2774 334/500 [===================>..........] - ETA: 54s - loss: 1.7889 - regression_loss: 1.5119 - classification_loss: 0.2770 335/500 [===================>..........] - ETA: 54s - loss: 1.7895 - regression_loss: 1.5125 - classification_loss: 0.2770 336/500 [===================>..........] - ETA: 54s - loss: 1.7904 - regression_loss: 1.5133 - classification_loss: 0.2771 337/500 [===================>..........] - ETA: 53s - loss: 1.7889 - regression_loss: 1.5122 - classification_loss: 0.2766 338/500 [===================>..........] - ETA: 53s - loss: 1.7911 - regression_loss: 1.5141 - classification_loss: 0.2770 339/500 [===================>..........] - ETA: 53s - loss: 1.7931 - regression_loss: 1.5157 - classification_loss: 0.2774 340/500 [===================>..........] 
- ETA: 52s - loss: 1.7923 - regression_loss: 1.5151 - classification_loss: 0.2773 341/500 [===================>..........] - ETA: 52s - loss: 1.7935 - regression_loss: 1.5163 - classification_loss: 0.2772 342/500 [===================>..........] - ETA: 52s - loss: 1.7929 - regression_loss: 1.5159 - classification_loss: 0.2770 343/500 [===================>..........] - ETA: 51s - loss: 1.7950 - regression_loss: 1.5177 - classification_loss: 0.2773 344/500 [===================>..........] - ETA: 51s - loss: 1.7926 - regression_loss: 1.5155 - classification_loss: 0.2770 345/500 [===================>..........] - ETA: 51s - loss: 1.7928 - regression_loss: 1.5157 - classification_loss: 0.2771 346/500 [===================>..........] - ETA: 50s - loss: 1.7944 - regression_loss: 1.5170 - classification_loss: 0.2773 347/500 [===================>..........] - ETA: 50s - loss: 1.7952 - regression_loss: 1.5178 - classification_loss: 0.2774 348/500 [===================>..........] - ETA: 50s - loss: 1.7955 - regression_loss: 1.5180 - classification_loss: 0.2775 349/500 [===================>..........] - ETA: 49s - loss: 1.7930 - regression_loss: 1.5157 - classification_loss: 0.2772 350/500 [====================>.........] - ETA: 49s - loss: 1.7920 - regression_loss: 1.5150 - classification_loss: 0.2770 351/500 [====================>.........] - ETA: 49s - loss: 1.7921 - regression_loss: 1.5151 - classification_loss: 0.2769 352/500 [====================>.........] - ETA: 48s - loss: 1.7912 - regression_loss: 1.5142 - classification_loss: 0.2770 353/500 [====================>.........] - ETA: 48s - loss: 1.7912 - regression_loss: 1.5141 - classification_loss: 0.2770 354/500 [====================>.........] - ETA: 48s - loss: 1.7912 - regression_loss: 1.5143 - classification_loss: 0.2769 355/500 [====================>.........] - ETA: 47s - loss: 1.7883 - regression_loss: 1.5118 - classification_loss: 0.2765 356/500 [====================>.........] 
- ETA: 47s - loss: 1.7860 - regression_loss: 1.5098 - classification_loss: 0.2762 357/500 [====================>.........] - ETA: 47s - loss: 1.7857 - regression_loss: 1.5097 - classification_loss: 0.2760 358/500 [====================>.........] - ETA: 46s - loss: 1.7857 - regression_loss: 1.5097 - classification_loss: 0.2760 359/500 [====================>.........] - ETA: 46s - loss: 1.7869 - regression_loss: 1.5107 - classification_loss: 0.2762 360/500 [====================>.........] - ETA: 46s - loss: 1.7879 - regression_loss: 1.5117 - classification_loss: 0.2763 361/500 [====================>.........] - ETA: 45s - loss: 1.7877 - regression_loss: 1.5113 - classification_loss: 0.2764 362/500 [====================>.........] - ETA: 45s - loss: 1.7880 - regression_loss: 1.5116 - classification_loss: 0.2764 363/500 [====================>.........] - ETA: 45s - loss: 1.7895 - regression_loss: 1.5128 - classification_loss: 0.2767 364/500 [====================>.........] - ETA: 44s - loss: 1.7894 - regression_loss: 1.5128 - classification_loss: 0.2766 365/500 [====================>.........] - ETA: 44s - loss: 1.7895 - regression_loss: 1.5130 - classification_loss: 0.2766 366/500 [====================>.........] - ETA: 44s - loss: 1.7905 - regression_loss: 1.5137 - classification_loss: 0.2769 367/500 [=====================>........] - ETA: 43s - loss: 1.7910 - regression_loss: 1.5140 - classification_loss: 0.2769 368/500 [=====================>........] - ETA: 43s - loss: 1.7913 - regression_loss: 1.5143 - classification_loss: 0.2770 369/500 [=====================>........] - ETA: 43s - loss: 1.7917 - regression_loss: 1.5146 - classification_loss: 0.2770 370/500 [=====================>........] - ETA: 42s - loss: 1.7918 - regression_loss: 1.5148 - classification_loss: 0.2769 371/500 [=====================>........] - ETA: 42s - loss: 1.7927 - regression_loss: 1.5156 - classification_loss: 0.2770 372/500 [=====================>........] 
- ETA: 42s - loss: 1.7928 - regression_loss: 1.5159 - classification_loss: 0.2769 373/500 [=====================>........] - ETA: 41s - loss: 1.7942 - regression_loss: 1.5173 - classification_loss: 0.2770 374/500 [=====================>........] - ETA: 41s - loss: 1.7945 - regression_loss: 1.5176 - classification_loss: 0.2770 375/500 [=====================>........] - ETA: 41s - loss: 1.7924 - regression_loss: 1.5157 - classification_loss: 0.2767 376/500 [=====================>........] - ETA: 40s - loss: 1.7931 - regression_loss: 1.5163 - classification_loss: 0.2768 377/500 [=====================>........] - ETA: 40s - loss: 1.7943 - regression_loss: 1.5173 - classification_loss: 0.2771 378/500 [=====================>........] - ETA: 40s - loss: 1.7936 - regression_loss: 1.5168 - classification_loss: 0.2768 379/500 [=====================>........] - ETA: 39s - loss: 1.7947 - regression_loss: 1.5179 - classification_loss: 0.2768 380/500 [=====================>........] - ETA: 39s - loss: 1.7957 - regression_loss: 1.5186 - classification_loss: 0.2771 381/500 [=====================>........] - ETA: 39s - loss: 1.7957 - regression_loss: 1.5185 - classification_loss: 0.2772 382/500 [=====================>........] - ETA: 39s - loss: 1.7950 - regression_loss: 1.5179 - classification_loss: 0.2770 383/500 [=====================>........] - ETA: 38s - loss: 1.7967 - regression_loss: 1.5193 - classification_loss: 0.2774 384/500 [======================>.......] - ETA: 38s - loss: 1.7979 - regression_loss: 1.5204 - classification_loss: 0.2775 385/500 [======================>.......] - ETA: 38s - loss: 1.7983 - regression_loss: 1.5207 - classification_loss: 0.2776 386/500 [======================>.......] - ETA: 37s - loss: 1.7989 - regression_loss: 1.5214 - classification_loss: 0.2775 387/500 [======================>.......] - ETA: 37s - loss: 1.7993 - regression_loss: 1.5218 - classification_loss: 0.2775 388/500 [======================>.......] 
- ETA: 37s - loss: 1.7994 - regression_loss: 1.5219 - classification_loss: 0.2775 389/500 [======================>.......] - ETA: 36s - loss: 1.8001 - regression_loss: 1.5226 - classification_loss: 0.2775 390/500 [======================>.......] - ETA: 36s - loss: 1.7996 - regression_loss: 1.5221 - classification_loss: 0.2774 391/500 [======================>.......] - ETA: 36s - loss: 1.8002 - regression_loss: 1.5227 - classification_loss: 0.2775 392/500 [======================>.......] - ETA: 35s - loss: 1.7996 - regression_loss: 1.5222 - classification_loss: 0.2774 393/500 [======================>.......] - ETA: 35s - loss: 1.8003 - regression_loss: 1.5228 - classification_loss: 0.2775 394/500 [======================>.......] - ETA: 35s - loss: 1.7989 - regression_loss: 1.5215 - classification_loss: 0.2774 395/500 [======================>.......] - ETA: 34s - loss: 1.7964 - regression_loss: 1.5194 - classification_loss: 0.2770 396/500 [======================>.......] - ETA: 34s - loss: 1.7964 - regression_loss: 1.5194 - classification_loss: 0.2770 397/500 [======================>.......] - ETA: 34s - loss: 1.7973 - regression_loss: 1.5201 - classification_loss: 0.2771 398/500 [======================>.......] - ETA: 33s - loss: 1.7979 - regression_loss: 1.5206 - classification_loss: 0.2772 399/500 [======================>.......] - ETA: 33s - loss: 1.7980 - regression_loss: 1.5208 - classification_loss: 0.2772 400/500 [=======================>......] - ETA: 33s - loss: 1.7980 - regression_loss: 1.5208 - classification_loss: 0.2772 401/500 [=======================>......] - ETA: 32s - loss: 1.7965 - regression_loss: 1.5192 - classification_loss: 0.2773 402/500 [=======================>......] - ETA: 32s - loss: 1.7964 - regression_loss: 1.5191 - classification_loss: 0.2773 403/500 [=======================>......] - ETA: 32s - loss: 1.7969 - regression_loss: 1.5195 - classification_loss: 0.2775 404/500 [=======================>......] 
[Epoch 12, batches 405-499: per-batch progress output omitted; the running loss drifted from ~1.796 down to ~1.769 (regression_loss ≈ 1.50, classification_loss ≈ 0.27).]
500/500 [==============================] - 165s 331ms/step - loss: 1.7692 - regression_loss: 1.4965 - classification_loss: 0.2727
1172 instances of class plum with average precision: 0.5234
mAP: 0.5234
Epoch 00012: saving model to ./training/snapshots/resnet101_pascal_12.h5
Epoch 13/150
[Batches 1-13: per-batch progress output omitted; the running loss settled near ~1.89 after a noisy start.]
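The epoch summary above reports three fields: the total loss and its two RetinaNet components, the box-regression (smooth L1) term and the classification (focal) term, which sum to the total. A minimal sketch of extracting and checking those fields from one such log line (the `parse_losses` helper is hypothetical, not part of keras-retinanet):

```python
import re

# One epoch-summary line copied from the log above.
LINE = ("500/500 [==============================] - 165s 331ms/step - "
        "loss: 1.7692 - regression_loss: 1.4965 - classification_loss: 0.2727")

def parse_losses(line):
    """Return {metric_name: value} for every '*_loss' or 'loss' field in a line."""
    # '\w+_loss' is tried first, so 'regression_loss' is captured whole
    # rather than as a bare trailing 'loss'.
    return {name: float(val)
            for name, val in re.findall(r"(\w+_loss|loss): ([\d.]+)", line)}

losses = parse_losses(LINE)
# RetinaNet's total loss is the sum of the regression and classification terms,
# so the parsed components should add up to the reported total.
assert abs(losses["loss"]
           - (losses["regression_loss"] + losses["classification_loss"])) < 1e-3
```

Running the same parser over every progress line makes it easy to plot loss curves instead of scrolling the raw log.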
[Epoch 13, batches 14-238: per-batch progress output omitted; the running loss declined from ~1.87 to ~1.73 (regression_loss ≈ 1.46, classification_loss ≈ 0.27).]
- ETA: 1:26 - loss: 1.7252 - regression_loss: 1.4596 - classification_loss: 0.2656 239/500 [=============>................] - ETA: 1:26 - loss: 1.7266 - regression_loss: 1.4610 - classification_loss: 0.2656 240/500 [=============>................] - ETA: 1:25 - loss: 1.7292 - regression_loss: 1.4630 - classification_loss: 0.2662 241/500 [=============>................] - ETA: 1:25 - loss: 1.7316 - regression_loss: 1.4654 - classification_loss: 0.2662 242/500 [=============>................] - ETA: 1:25 - loss: 1.7338 - regression_loss: 1.4669 - classification_loss: 0.2669 243/500 [=============>................] - ETA: 1:24 - loss: 1.7352 - regression_loss: 1.4682 - classification_loss: 0.2670 244/500 [=============>................] - ETA: 1:24 - loss: 1.7358 - regression_loss: 1.4687 - classification_loss: 0.2671 245/500 [=============>................] - ETA: 1:24 - loss: 1.7327 - regression_loss: 1.4661 - classification_loss: 0.2666 246/500 [=============>................] - ETA: 1:23 - loss: 1.7313 - regression_loss: 1.4650 - classification_loss: 0.2663 247/500 [=============>................] - ETA: 1:23 - loss: 1.7333 - regression_loss: 1.4665 - classification_loss: 0.2668 248/500 [=============>................] - ETA: 1:23 - loss: 1.7326 - regression_loss: 1.4661 - classification_loss: 0.2666 249/500 [=============>................] - ETA: 1:22 - loss: 1.7306 - regression_loss: 1.4644 - classification_loss: 0.2662 250/500 [==============>...............] - ETA: 1:22 - loss: 1.7313 - regression_loss: 1.4650 - classification_loss: 0.2663 251/500 [==============>...............] - ETA: 1:22 - loss: 1.7297 - regression_loss: 1.4636 - classification_loss: 0.2661 252/500 [==============>...............] - ETA: 1:21 - loss: 1.7290 - regression_loss: 1.4630 - classification_loss: 0.2660 253/500 [==============>...............] - ETA: 1:21 - loss: 1.7292 - regression_loss: 1.4632 - classification_loss: 0.2660 254/500 [==============>...............] 
- ETA: 1:21 - loss: 1.7308 - regression_loss: 1.4647 - classification_loss: 0.2661 255/500 [==============>...............] - ETA: 1:20 - loss: 1.7301 - regression_loss: 1.4640 - classification_loss: 0.2660 256/500 [==============>...............] - ETA: 1:20 - loss: 1.7267 - regression_loss: 1.4611 - classification_loss: 0.2656 257/500 [==============>...............] - ETA: 1:20 - loss: 1.7258 - regression_loss: 1.4607 - classification_loss: 0.2652 258/500 [==============>...............] - ETA: 1:19 - loss: 1.7228 - regression_loss: 1.4580 - classification_loss: 0.2648 259/500 [==============>...............] - ETA: 1:19 - loss: 1.7202 - regression_loss: 1.4557 - classification_loss: 0.2645 260/500 [==============>...............] - ETA: 1:19 - loss: 1.7167 - regression_loss: 1.4526 - classification_loss: 0.2640 261/500 [==============>...............] - ETA: 1:18 - loss: 1.7144 - regression_loss: 1.4508 - classification_loss: 0.2635 262/500 [==============>...............] - ETA: 1:18 - loss: 1.7156 - regression_loss: 1.4520 - classification_loss: 0.2636 263/500 [==============>...............] - ETA: 1:18 - loss: 1.7167 - regression_loss: 1.4532 - classification_loss: 0.2635 264/500 [==============>...............] - ETA: 1:17 - loss: 1.7167 - regression_loss: 1.4533 - classification_loss: 0.2634 265/500 [==============>...............] - ETA: 1:17 - loss: 1.7194 - regression_loss: 1.4557 - classification_loss: 0.2637 266/500 [==============>...............] - ETA: 1:17 - loss: 1.7199 - regression_loss: 1.4562 - classification_loss: 0.2637 267/500 [===============>..............] - ETA: 1:16 - loss: 1.7154 - regression_loss: 1.4523 - classification_loss: 0.2631 268/500 [===============>..............] - ETA: 1:16 - loss: 1.7157 - regression_loss: 1.4527 - classification_loss: 0.2631 269/500 [===============>..............] - ETA: 1:16 - loss: 1.7143 - regression_loss: 1.4516 - classification_loss: 0.2627 270/500 [===============>..............] 
- ETA: 1:15 - loss: 1.7158 - regression_loss: 1.4530 - classification_loss: 0.2628 271/500 [===============>..............] - ETA: 1:15 - loss: 1.7166 - regression_loss: 1.4537 - classification_loss: 0.2629 272/500 [===============>..............] - ETA: 1:15 - loss: 1.7129 - regression_loss: 1.4505 - classification_loss: 0.2624 273/500 [===============>..............] - ETA: 1:14 - loss: 1.7153 - regression_loss: 1.4524 - classification_loss: 0.2629 274/500 [===============>..............] - ETA: 1:14 - loss: 1.7169 - regression_loss: 1.4538 - classification_loss: 0.2631 275/500 [===============>..............] - ETA: 1:14 - loss: 1.7161 - regression_loss: 1.4533 - classification_loss: 0.2628 276/500 [===============>..............] - ETA: 1:13 - loss: 1.7127 - regression_loss: 1.4504 - classification_loss: 0.2624 277/500 [===============>..............] - ETA: 1:13 - loss: 1.7125 - regression_loss: 1.4503 - classification_loss: 0.2623 278/500 [===============>..............] - ETA: 1:13 - loss: 1.7147 - regression_loss: 1.4521 - classification_loss: 0.2626 279/500 [===============>..............] - ETA: 1:12 - loss: 1.7137 - regression_loss: 1.4513 - classification_loss: 0.2624 280/500 [===============>..............] - ETA: 1:12 - loss: 1.7141 - regression_loss: 1.4517 - classification_loss: 0.2623 281/500 [===============>..............] - ETA: 1:12 - loss: 1.7142 - regression_loss: 1.4518 - classification_loss: 0.2624 282/500 [===============>..............] - ETA: 1:11 - loss: 1.7153 - regression_loss: 1.4530 - classification_loss: 0.2623 283/500 [===============>..............] - ETA: 1:11 - loss: 1.7120 - regression_loss: 1.4501 - classification_loss: 0.2619 284/500 [================>.............] - ETA: 1:11 - loss: 1.7115 - regression_loss: 1.4500 - classification_loss: 0.2616 285/500 [================>.............] - ETA: 1:10 - loss: 1.7111 - regression_loss: 1.4496 - classification_loss: 0.2614 286/500 [================>.............] 
- ETA: 1:10 - loss: 1.7138 - regression_loss: 1.4518 - classification_loss: 0.2620 287/500 [================>.............] - ETA: 1:10 - loss: 1.7146 - regression_loss: 1.4526 - classification_loss: 0.2620 288/500 [================>.............] - ETA: 1:09 - loss: 1.7145 - regression_loss: 1.4526 - classification_loss: 0.2619 289/500 [================>.............] - ETA: 1:09 - loss: 1.7151 - regression_loss: 1.4530 - classification_loss: 0.2620 290/500 [================>.............] - ETA: 1:09 - loss: 1.7131 - regression_loss: 1.4513 - classification_loss: 0.2618 291/500 [================>.............] - ETA: 1:08 - loss: 1.7133 - regression_loss: 1.4513 - classification_loss: 0.2619 292/500 [================>.............] - ETA: 1:08 - loss: 1.7134 - regression_loss: 1.4515 - classification_loss: 0.2619 293/500 [================>.............] - ETA: 1:08 - loss: 1.7161 - regression_loss: 1.4536 - classification_loss: 0.2625 294/500 [================>.............] - ETA: 1:07 - loss: 1.7156 - regression_loss: 1.4531 - classification_loss: 0.2625 295/500 [================>.............] - ETA: 1:07 - loss: 1.7150 - regression_loss: 1.4525 - classification_loss: 0.2624 296/500 [================>.............] - ETA: 1:07 - loss: 1.7146 - regression_loss: 1.4521 - classification_loss: 0.2625 297/500 [================>.............] - ETA: 1:06 - loss: 1.7153 - regression_loss: 1.4528 - classification_loss: 0.2625 298/500 [================>.............] - ETA: 1:06 - loss: 1.7134 - regression_loss: 1.4510 - classification_loss: 0.2624 299/500 [================>.............] - ETA: 1:06 - loss: 1.7145 - regression_loss: 1.4519 - classification_loss: 0.2626 300/500 [=================>............] - ETA: 1:05 - loss: 1.7128 - regression_loss: 1.4504 - classification_loss: 0.2623 301/500 [=================>............] - ETA: 1:05 - loss: 1.7128 - regression_loss: 1.4506 - classification_loss: 0.2623 302/500 [=================>............] 
- ETA: 1:05 - loss: 1.7148 - regression_loss: 1.4521 - classification_loss: 0.2627 303/500 [=================>............] - ETA: 1:04 - loss: 1.7155 - regression_loss: 1.4527 - classification_loss: 0.2628 304/500 [=================>............] - ETA: 1:04 - loss: 1.7163 - regression_loss: 1.4528 - classification_loss: 0.2635 305/500 [=================>............] - ETA: 1:04 - loss: 1.7175 - regression_loss: 1.4538 - classification_loss: 0.2637 306/500 [=================>............] - ETA: 1:03 - loss: 1.7176 - regression_loss: 1.4538 - classification_loss: 0.2638 307/500 [=================>............] - ETA: 1:03 - loss: 1.7178 - regression_loss: 1.4540 - classification_loss: 0.2638 308/500 [=================>............] - ETA: 1:03 - loss: 1.7187 - regression_loss: 1.4548 - classification_loss: 0.2639 309/500 [=================>............] - ETA: 1:02 - loss: 1.7185 - regression_loss: 1.4548 - classification_loss: 0.2638 310/500 [=================>............] - ETA: 1:02 - loss: 1.7198 - regression_loss: 1.4557 - classification_loss: 0.2640 311/500 [=================>............] - ETA: 1:02 - loss: 1.7186 - regression_loss: 1.4548 - classification_loss: 0.2638 312/500 [=================>............] - ETA: 1:01 - loss: 1.7207 - regression_loss: 1.4567 - classification_loss: 0.2641 313/500 [=================>............] - ETA: 1:01 - loss: 1.7197 - regression_loss: 1.4558 - classification_loss: 0.2639 314/500 [=================>............] - ETA: 1:01 - loss: 1.7191 - regression_loss: 1.4554 - classification_loss: 0.2637 315/500 [=================>............] - ETA: 1:01 - loss: 1.7205 - regression_loss: 1.4567 - classification_loss: 0.2638 316/500 [=================>............] - ETA: 1:00 - loss: 1.7191 - regression_loss: 1.4554 - classification_loss: 0.2637 317/500 [==================>...........] - ETA: 1:00 - loss: 1.7201 - regression_loss: 1.4562 - classification_loss: 0.2639 318/500 [==================>...........] 
- ETA: 1:00 - loss: 1.7209 - regression_loss: 1.4568 - classification_loss: 0.2640 319/500 [==================>...........] - ETA: 59s - loss: 1.7214 - regression_loss: 1.4574 - classification_loss: 0.2640  320/500 [==================>...........] - ETA: 59s - loss: 1.7218 - regression_loss: 1.4578 - classification_loss: 0.2640 321/500 [==================>...........] - ETA: 59s - loss: 1.7201 - regression_loss: 1.4565 - classification_loss: 0.2636 322/500 [==================>...........] - ETA: 58s - loss: 1.7203 - regression_loss: 1.4568 - classification_loss: 0.2636 323/500 [==================>...........] - ETA: 58s - loss: 1.7199 - regression_loss: 1.4563 - classification_loss: 0.2635 324/500 [==================>...........] - ETA: 58s - loss: 1.7206 - regression_loss: 1.4568 - classification_loss: 0.2638 325/500 [==================>...........] - ETA: 57s - loss: 1.7217 - regression_loss: 1.4576 - classification_loss: 0.2641 326/500 [==================>...........] - ETA: 57s - loss: 1.7195 - regression_loss: 1.4557 - classification_loss: 0.2638 327/500 [==================>...........] - ETA: 57s - loss: 1.7171 - regression_loss: 1.4535 - classification_loss: 0.2636 328/500 [==================>...........] - ETA: 56s - loss: 1.7133 - regression_loss: 1.4502 - classification_loss: 0.2631 329/500 [==================>...........] - ETA: 56s - loss: 1.7143 - regression_loss: 1.4510 - classification_loss: 0.2633 330/500 [==================>...........] - ETA: 56s - loss: 1.7146 - regression_loss: 1.4513 - classification_loss: 0.2633 331/500 [==================>...........] - ETA: 55s - loss: 1.7151 - regression_loss: 1.4517 - classification_loss: 0.2634 332/500 [==================>...........] - ETA: 55s - loss: 1.7150 - regression_loss: 1.4517 - classification_loss: 0.2633 333/500 [==================>...........] - ETA: 55s - loss: 1.7149 - regression_loss: 1.4516 - classification_loss: 0.2633 334/500 [===================>..........] 
- ETA: 54s - loss: 1.7131 - regression_loss: 1.4501 - classification_loss: 0.2630 335/500 [===================>..........] - ETA: 54s - loss: 1.7148 - regression_loss: 1.4515 - classification_loss: 0.2633 336/500 [===================>..........] - ETA: 54s - loss: 1.7164 - regression_loss: 1.4529 - classification_loss: 0.2635 337/500 [===================>..........] - ETA: 53s - loss: 1.7178 - regression_loss: 1.4541 - classification_loss: 0.2637 338/500 [===================>..........] - ETA: 53s - loss: 1.7176 - regression_loss: 1.4540 - classification_loss: 0.2636 339/500 [===================>..........] - ETA: 53s - loss: 1.7179 - regression_loss: 1.4543 - classification_loss: 0.2637 340/500 [===================>..........] - ETA: 52s - loss: 1.7169 - regression_loss: 1.4533 - classification_loss: 0.2636 341/500 [===================>..........] - ETA: 52s - loss: 1.7150 - regression_loss: 1.4517 - classification_loss: 0.2633 342/500 [===================>..........] - ETA: 52s - loss: 1.7151 - regression_loss: 1.4519 - classification_loss: 0.2632 343/500 [===================>..........] - ETA: 51s - loss: 1.7155 - regression_loss: 1.4522 - classification_loss: 0.2633 344/500 [===================>..........] - ETA: 51s - loss: 1.7155 - regression_loss: 1.4522 - classification_loss: 0.2633 345/500 [===================>..........] - ETA: 51s - loss: 1.7154 - regression_loss: 1.4524 - classification_loss: 0.2631 346/500 [===================>..........] - ETA: 50s - loss: 1.7134 - regression_loss: 1.4506 - classification_loss: 0.2628 347/500 [===================>..........] - ETA: 50s - loss: 1.7127 - regression_loss: 1.4502 - classification_loss: 0.2625 348/500 [===================>..........] - ETA: 50s - loss: 1.7119 - regression_loss: 1.4494 - classification_loss: 0.2625 349/500 [===================>..........] - ETA: 49s - loss: 1.7123 - regression_loss: 1.4497 - classification_loss: 0.2626 350/500 [====================>.........] 
- ETA: 49s - loss: 1.7125 - regression_loss: 1.4498 - classification_loss: 0.2626 351/500 [====================>.........] - ETA: 49s - loss: 1.7153 - regression_loss: 1.4521 - classification_loss: 0.2632 352/500 [====================>.........] - ETA: 48s - loss: 1.7168 - regression_loss: 1.4534 - classification_loss: 0.2634 353/500 [====================>.........] - ETA: 48s - loss: 1.7164 - regression_loss: 1.4531 - classification_loss: 0.2633 354/500 [====================>.........] - ETA: 48s - loss: 1.7137 - regression_loss: 1.4508 - classification_loss: 0.2629 355/500 [====================>.........] - ETA: 47s - loss: 1.7145 - regression_loss: 1.4515 - classification_loss: 0.2630 356/500 [====================>.........] - ETA: 47s - loss: 1.7149 - regression_loss: 1.4519 - classification_loss: 0.2630 357/500 [====================>.........] - ETA: 47s - loss: 1.7151 - regression_loss: 1.4519 - classification_loss: 0.2631 358/500 [====================>.........] - ETA: 46s - loss: 1.7165 - regression_loss: 1.4531 - classification_loss: 0.2634 359/500 [====================>.........] - ETA: 46s - loss: 1.7171 - regression_loss: 1.4535 - classification_loss: 0.2636 360/500 [====================>.........] - ETA: 46s - loss: 1.7182 - regression_loss: 1.4545 - classification_loss: 0.2637 361/500 [====================>.........] - ETA: 45s - loss: 1.7179 - regression_loss: 1.4543 - classification_loss: 0.2636 362/500 [====================>.........] - ETA: 45s - loss: 1.7183 - regression_loss: 1.4547 - classification_loss: 0.2636 363/500 [====================>.........] - ETA: 45s - loss: 1.7186 - regression_loss: 1.4550 - classification_loss: 0.2636 364/500 [====================>.........] - ETA: 44s - loss: 1.7208 - regression_loss: 1.4569 - classification_loss: 0.2639 365/500 [====================>.........] - ETA: 44s - loss: 1.7203 - regression_loss: 1.4564 - classification_loss: 0.2639 366/500 [====================>.........] 
- ETA: 44s - loss: 1.7206 - regression_loss: 1.4568 - classification_loss: 0.2638 367/500 [=====================>........] - ETA: 43s - loss: 1.7212 - regression_loss: 1.4574 - classification_loss: 0.2637 368/500 [=====================>........] - ETA: 43s - loss: 1.7183 - regression_loss: 1.4549 - classification_loss: 0.2634 369/500 [=====================>........] - ETA: 43s - loss: 1.7188 - regression_loss: 1.4553 - classification_loss: 0.2635 370/500 [=====================>........] - ETA: 42s - loss: 1.7190 - regression_loss: 1.4553 - classification_loss: 0.2637 371/500 [=====================>........] - ETA: 42s - loss: 1.7186 - regression_loss: 1.4550 - classification_loss: 0.2636 372/500 [=====================>........] - ETA: 42s - loss: 1.7186 - regression_loss: 1.4551 - classification_loss: 0.2635 373/500 [=====================>........] - ETA: 41s - loss: 1.7193 - regression_loss: 1.4556 - classification_loss: 0.2637 374/500 [=====================>........] - ETA: 41s - loss: 1.7200 - regression_loss: 1.4562 - classification_loss: 0.2638 375/500 [=====================>........] - ETA: 41s - loss: 1.7202 - regression_loss: 1.4564 - classification_loss: 0.2639 376/500 [=====================>........] - ETA: 40s - loss: 1.7198 - regression_loss: 1.4559 - classification_loss: 0.2639 377/500 [=====================>........] - ETA: 40s - loss: 1.7203 - regression_loss: 1.4563 - classification_loss: 0.2640 378/500 [=====================>........] - ETA: 40s - loss: 1.7182 - regression_loss: 1.4544 - classification_loss: 0.2638 379/500 [=====================>........] - ETA: 39s - loss: 1.7162 - regression_loss: 1.4525 - classification_loss: 0.2636 380/500 [=====================>........] - ETA: 39s - loss: 1.7138 - regression_loss: 1.4504 - classification_loss: 0.2634 381/500 [=====================>........] - ETA: 39s - loss: 1.7141 - regression_loss: 1.4506 - classification_loss: 0.2634 382/500 [=====================>........] 
- ETA: 38s - loss: 1.7150 - regression_loss: 1.4516 - classification_loss: 0.2634 383/500 [=====================>........] - ETA: 38s - loss: 1.7154 - regression_loss: 1.4520 - classification_loss: 0.2635 384/500 [======================>.......] - ETA: 38s - loss: 1.7131 - regression_loss: 1.4500 - classification_loss: 0.2631 385/500 [======================>.......] - ETA: 37s - loss: 1.7102 - regression_loss: 1.4475 - classification_loss: 0.2628 386/500 [======================>.......] - ETA: 37s - loss: 1.7103 - regression_loss: 1.4475 - classification_loss: 0.2628 387/500 [======================>.......] - ETA: 37s - loss: 1.7083 - regression_loss: 1.4458 - classification_loss: 0.2624 388/500 [======================>.......] - ETA: 36s - loss: 1.7086 - regression_loss: 1.4461 - classification_loss: 0.2624 389/500 [======================>.......] - ETA: 36s - loss: 1.7066 - regression_loss: 1.4445 - classification_loss: 0.2621 390/500 [======================>.......] - ETA: 36s - loss: 1.7062 - regression_loss: 1.4443 - classification_loss: 0.2620 391/500 [======================>.......] - ETA: 35s - loss: 1.7061 - regression_loss: 1.4442 - classification_loss: 0.2619 392/500 [======================>.......] - ETA: 35s - loss: 1.7066 - regression_loss: 1.4444 - classification_loss: 0.2621 393/500 [======================>.......] - ETA: 35s - loss: 1.7069 - regression_loss: 1.4448 - classification_loss: 0.2621 394/500 [======================>.......] - ETA: 34s - loss: 1.7086 - regression_loss: 1.4464 - classification_loss: 0.2622 395/500 [======================>.......] - ETA: 34s - loss: 1.7088 - regression_loss: 1.4465 - classification_loss: 0.2623 396/500 [======================>.......] - ETA: 34s - loss: 1.7079 - regression_loss: 1.4458 - classification_loss: 0.2621 397/500 [======================>.......] - ETA: 33s - loss: 1.7086 - regression_loss: 1.4463 - classification_loss: 0.2622 398/500 [======================>.......] 
- ETA: 33s - loss: 1.7083 - regression_loss: 1.4461 - classification_loss: 0.2622 399/500 [======================>.......] - ETA: 33s - loss: 1.7068 - regression_loss: 1.4448 - classification_loss: 0.2619 400/500 [=======================>......] - ETA: 32s - loss: 1.7069 - regression_loss: 1.4450 - classification_loss: 0.2619 401/500 [=======================>......] - ETA: 32s - loss: 1.7086 - regression_loss: 1.4465 - classification_loss: 0.2621 402/500 [=======================>......] - ETA: 32s - loss: 1.7103 - regression_loss: 1.4479 - classification_loss: 0.2624 403/500 [=======================>......] - ETA: 31s - loss: 1.7110 - regression_loss: 1.4486 - classification_loss: 0.2625 404/500 [=======================>......] - ETA: 31s - loss: 1.7111 - regression_loss: 1.4486 - classification_loss: 0.2625 405/500 [=======================>......] - ETA: 31s - loss: 1.7108 - regression_loss: 1.4484 - classification_loss: 0.2624 406/500 [=======================>......] - ETA: 31s - loss: 1.7112 - regression_loss: 1.4489 - classification_loss: 0.2623 407/500 [=======================>......] - ETA: 30s - loss: 1.7104 - regression_loss: 1.4483 - classification_loss: 0.2622 408/500 [=======================>......] - ETA: 30s - loss: 1.7105 - regression_loss: 1.4483 - classification_loss: 0.2622 409/500 [=======================>......] - ETA: 30s - loss: 1.7089 - regression_loss: 1.4470 - classification_loss: 0.2619 410/500 [=======================>......] - ETA: 29s - loss: 1.7096 - regression_loss: 1.4475 - classification_loss: 0.2621 411/500 [=======================>......] - ETA: 29s - loss: 1.7086 - regression_loss: 1.4467 - classification_loss: 0.2619 412/500 [=======================>......] - ETA: 29s - loss: 1.7084 - regression_loss: 1.4465 - classification_loss: 0.2618 413/500 [=======================>......] - ETA: 28s - loss: 1.7104 - regression_loss: 1.4482 - classification_loss: 0.2622 414/500 [=======================>......] 
- ETA: 28s - loss: 1.7106 - regression_loss: 1.4484 - classification_loss: 0.2622 415/500 [=======================>......] - ETA: 28s - loss: 1.7103 - regression_loss: 1.4482 - classification_loss: 0.2621 416/500 [=======================>......] - ETA: 27s - loss: 1.7094 - regression_loss: 1.4475 - classification_loss: 0.2619 417/500 [========================>.....] - ETA: 27s - loss: 1.7089 - regression_loss: 1.4469 - classification_loss: 0.2621 418/500 [========================>.....] - ETA: 27s - loss: 1.7087 - regression_loss: 1.4466 - classification_loss: 0.2621 419/500 [========================>.....] - ETA: 26s - loss: 1.7089 - regression_loss: 1.4469 - classification_loss: 0.2620 420/500 [========================>.....] - ETA: 26s - loss: 1.7076 - regression_loss: 1.4459 - classification_loss: 0.2617 421/500 [========================>.....] - ETA: 26s - loss: 1.7082 - regression_loss: 1.4465 - classification_loss: 0.2618 422/500 [========================>.....] - ETA: 25s - loss: 1.7094 - regression_loss: 1.4473 - classification_loss: 0.2621 423/500 [========================>.....] - ETA: 25s - loss: 1.7071 - regression_loss: 1.4453 - classification_loss: 0.2618 424/500 [========================>.....] - ETA: 25s - loss: 1.7089 - regression_loss: 1.4466 - classification_loss: 0.2623 425/500 [========================>.....] - ETA: 24s - loss: 1.7094 - regression_loss: 1.4469 - classification_loss: 0.2625 426/500 [========================>.....] - ETA: 24s - loss: 1.7093 - regression_loss: 1.4468 - classification_loss: 0.2625 427/500 [========================>.....] - ETA: 24s - loss: 1.7101 - regression_loss: 1.4474 - classification_loss: 0.2627 428/500 [========================>.....] - ETA: 23s - loss: 1.7089 - regression_loss: 1.4464 - classification_loss: 0.2625 429/500 [========================>.....] - ETA: 23s - loss: 1.7097 - regression_loss: 1.4471 - classification_loss: 0.2626 430/500 [========================>.....] 
- ETA: 23s - loss: 1.7095 - regression_loss: 1.4469 - classification_loss: 0.2627 431/500 [========================>.....] - ETA: 22s - loss: 1.7106 - regression_loss: 1.4477 - classification_loss: 0.2628 432/500 [========================>.....] - ETA: 22s - loss: 1.7108 - regression_loss: 1.4480 - classification_loss: 0.2629 433/500 [========================>.....] - ETA: 22s - loss: 1.7116 - regression_loss: 1.4487 - classification_loss: 0.2629 434/500 [=========================>....] - ETA: 21s - loss: 1.7116 - regression_loss: 1.4487 - classification_loss: 0.2629 435/500 [=========================>....] - ETA: 21s - loss: 1.7115 - regression_loss: 1.4488 - classification_loss: 0.2627 436/500 [=========================>....] - ETA: 21s - loss: 1.7125 - regression_loss: 1.4498 - classification_loss: 0.2628 437/500 [=========================>....] - ETA: 20s - loss: 1.7105 - regression_loss: 1.4479 - classification_loss: 0.2626 438/500 [=========================>....] - ETA: 20s - loss: 1.7114 - regression_loss: 1.4488 - classification_loss: 0.2626 439/500 [=========================>....] - ETA: 20s - loss: 1.7124 - regression_loss: 1.4496 - classification_loss: 0.2627 440/500 [=========================>....] - ETA: 19s - loss: 1.7114 - regression_loss: 1.4488 - classification_loss: 0.2626 441/500 [=========================>....] - ETA: 19s - loss: 1.7095 - regression_loss: 1.4473 - classification_loss: 0.2623 442/500 [=========================>....] - ETA: 19s - loss: 1.7109 - regression_loss: 1.4484 - classification_loss: 0.2625 443/500 [=========================>....] - ETA: 18s - loss: 1.7127 - regression_loss: 1.4498 - classification_loss: 0.2629 444/500 [=========================>....] - ETA: 18s - loss: 1.7137 - regression_loss: 1.4508 - classification_loss: 0.2629 445/500 [=========================>....] - ETA: 18s - loss: 1.7133 - regression_loss: 1.4505 - classification_loss: 0.2628 446/500 [=========================>....] 
[... per-step progress updates for epoch 13 (steps 447-499) truncated; loss held steady around 1.70-1.71 ...]
500/500 [==============================] - 165s 330ms/step - loss: 1.7009 - regression_loss: 1.4401 - classification_loss: 0.2607
1172 instances of class plum with average precision: 0.5526
mAP: 0.5526
Epoch 00013: saving model to ./training/snapshots/resnet101_pascal_13.h5
Epoch 14/150
[... per-step progress updates for epoch 14 (steps 1-281) truncated; running loss settled around 1.62 (regression_loss ~1.37, classification_loss ~0.25) by step 281/500 ...]
- ETA: 1:12 - loss: 1.6224 - regression_loss: 1.3713 - classification_loss: 0.2511 282/500 [===============>..............] - ETA: 1:12 - loss: 1.6225 - regression_loss: 1.3713 - classification_loss: 0.2511 283/500 [===============>..............] - ETA: 1:11 - loss: 1.6228 - regression_loss: 1.3717 - classification_loss: 0.2511 284/500 [================>.............] - ETA: 1:11 - loss: 1.6244 - regression_loss: 1.3728 - classification_loss: 0.2516 285/500 [================>.............] - ETA: 1:11 - loss: 1.6246 - regression_loss: 1.3729 - classification_loss: 0.2516 286/500 [================>.............] - ETA: 1:10 - loss: 1.6250 - regression_loss: 1.3735 - classification_loss: 0.2516 287/500 [================>.............] - ETA: 1:10 - loss: 1.6252 - regression_loss: 1.3737 - classification_loss: 0.2515 288/500 [================>.............] - ETA: 1:10 - loss: 1.6259 - regression_loss: 1.3743 - classification_loss: 0.2516 289/500 [================>.............] - ETA: 1:09 - loss: 1.6268 - regression_loss: 1.3751 - classification_loss: 0.2517 290/500 [================>.............] - ETA: 1:09 - loss: 1.6278 - regression_loss: 1.3760 - classification_loss: 0.2519 291/500 [================>.............] - ETA: 1:09 - loss: 1.6266 - regression_loss: 1.3749 - classification_loss: 0.2517 292/500 [================>.............] - ETA: 1:08 - loss: 1.6273 - regression_loss: 1.3756 - classification_loss: 0.2517 293/500 [================>.............] - ETA: 1:08 - loss: 1.6267 - regression_loss: 1.3750 - classification_loss: 0.2517 294/500 [================>.............] - ETA: 1:08 - loss: 1.6275 - regression_loss: 1.3757 - classification_loss: 0.2517 295/500 [================>.............] - ETA: 1:07 - loss: 1.6283 - regression_loss: 1.3765 - classification_loss: 0.2518 296/500 [================>.............] - ETA: 1:07 - loss: 1.6252 - regression_loss: 1.3737 - classification_loss: 0.2516 297/500 [================>.............] 
- ETA: 1:07 - loss: 1.6241 - regression_loss: 1.3727 - classification_loss: 0.2514 298/500 [================>.............] - ETA: 1:06 - loss: 1.6245 - regression_loss: 1.3732 - classification_loss: 0.2513 299/500 [================>.............] - ETA: 1:06 - loss: 1.6242 - regression_loss: 1.3730 - classification_loss: 0.2512 300/500 [=================>............] - ETA: 1:06 - loss: 1.6228 - regression_loss: 1.3719 - classification_loss: 0.2510 301/500 [=================>............] - ETA: 1:05 - loss: 1.6246 - regression_loss: 1.3735 - classification_loss: 0.2511 302/500 [=================>............] - ETA: 1:05 - loss: 1.6233 - regression_loss: 1.3725 - classification_loss: 0.2508 303/500 [=================>............] - ETA: 1:05 - loss: 1.6221 - regression_loss: 1.3716 - classification_loss: 0.2505 304/500 [=================>............] - ETA: 1:04 - loss: 1.6198 - regression_loss: 1.3696 - classification_loss: 0.2502 305/500 [=================>............] - ETA: 1:04 - loss: 1.6208 - regression_loss: 1.3704 - classification_loss: 0.2504 306/500 [=================>............] - ETA: 1:04 - loss: 1.6194 - regression_loss: 1.3692 - classification_loss: 0.2502 307/500 [=================>............] - ETA: 1:03 - loss: 1.6207 - regression_loss: 1.3704 - classification_loss: 0.2503 308/500 [=================>............] - ETA: 1:03 - loss: 1.6210 - regression_loss: 1.3708 - classification_loss: 0.2503 309/500 [=================>............] - ETA: 1:03 - loss: 1.6177 - regression_loss: 1.3679 - classification_loss: 0.2498 310/500 [=================>............] - ETA: 1:02 - loss: 1.6182 - regression_loss: 1.3683 - classification_loss: 0.2499 311/500 [=================>............] - ETA: 1:02 - loss: 1.6214 - regression_loss: 1.3710 - classification_loss: 0.2503 312/500 [=================>............] - ETA: 1:02 - loss: 1.6205 - regression_loss: 1.3703 - classification_loss: 0.2502 313/500 [=================>............] 
- ETA: 1:01 - loss: 1.6188 - regression_loss: 1.3689 - classification_loss: 0.2499 314/500 [=================>............] - ETA: 1:01 - loss: 1.6214 - regression_loss: 1.3709 - classification_loss: 0.2505 315/500 [=================>............] - ETA: 1:01 - loss: 1.6225 - regression_loss: 1.3718 - classification_loss: 0.2507 316/500 [=================>............] - ETA: 1:00 - loss: 1.6229 - regression_loss: 1.3720 - classification_loss: 0.2509 317/500 [==================>...........] - ETA: 1:00 - loss: 1.6245 - regression_loss: 1.3735 - classification_loss: 0.2510 318/500 [==================>...........] - ETA: 1:00 - loss: 1.6258 - regression_loss: 1.3746 - classification_loss: 0.2512 319/500 [==================>...........] - ETA: 59s - loss: 1.6255 - regression_loss: 1.3743 - classification_loss: 0.2512  320/500 [==================>...........] - ETA: 59s - loss: 1.6251 - regression_loss: 1.3737 - classification_loss: 0.2513 321/500 [==================>...........] - ETA: 59s - loss: 1.6227 - regression_loss: 1.3717 - classification_loss: 0.2510 322/500 [==================>...........] - ETA: 58s - loss: 1.6249 - regression_loss: 1.3733 - classification_loss: 0.2516 323/500 [==================>...........] - ETA: 58s - loss: 1.6252 - regression_loss: 1.3739 - classification_loss: 0.2513 324/500 [==================>...........] - ETA: 58s - loss: 1.6226 - regression_loss: 1.3716 - classification_loss: 0.2509 325/500 [==================>...........] - ETA: 57s - loss: 1.6204 - regression_loss: 1.3698 - classification_loss: 0.2506 326/500 [==================>...........] - ETA: 57s - loss: 1.6217 - regression_loss: 1.3709 - classification_loss: 0.2508 327/500 [==================>...........] - ETA: 57s - loss: 1.6215 - regression_loss: 1.3707 - classification_loss: 0.2508 328/500 [==================>...........] - ETA: 56s - loss: 1.6232 - regression_loss: 1.3721 - classification_loss: 0.2511 329/500 [==================>...........] 
- ETA: 56s - loss: 1.6212 - regression_loss: 1.3704 - classification_loss: 0.2508 330/500 [==================>...........] - ETA: 56s - loss: 1.6220 - regression_loss: 1.3710 - classification_loss: 0.2510 331/500 [==================>...........] - ETA: 55s - loss: 1.6230 - regression_loss: 1.3720 - classification_loss: 0.2509 332/500 [==================>...........] - ETA: 55s - loss: 1.6239 - regression_loss: 1.3727 - classification_loss: 0.2511 333/500 [==================>...........] - ETA: 55s - loss: 1.6234 - regression_loss: 1.3724 - classification_loss: 0.2511 334/500 [===================>..........] - ETA: 54s - loss: 1.6233 - regression_loss: 1.3723 - classification_loss: 0.2511 335/500 [===================>..........] - ETA: 54s - loss: 1.6248 - regression_loss: 1.3734 - classification_loss: 0.2514 336/500 [===================>..........] - ETA: 54s - loss: 1.6224 - regression_loss: 1.3714 - classification_loss: 0.2510 337/500 [===================>..........] - ETA: 53s - loss: 1.6232 - regression_loss: 1.3721 - classification_loss: 0.2511 338/500 [===================>..........] - ETA: 53s - loss: 1.6255 - regression_loss: 1.3738 - classification_loss: 0.2516 339/500 [===================>..........] - ETA: 53s - loss: 1.6252 - regression_loss: 1.3737 - classification_loss: 0.2515 340/500 [===================>..........] - ETA: 52s - loss: 1.6256 - regression_loss: 1.3741 - classification_loss: 0.2515 341/500 [===================>..........] - ETA: 52s - loss: 1.6260 - regression_loss: 1.3745 - classification_loss: 0.2515 342/500 [===================>..........] - ETA: 52s - loss: 1.6276 - regression_loss: 1.3759 - classification_loss: 0.2517 343/500 [===================>..........] - ETA: 51s - loss: 1.6284 - regression_loss: 1.3769 - classification_loss: 0.2516 344/500 [===================>..........] - ETA: 51s - loss: 1.6273 - regression_loss: 1.3759 - classification_loss: 0.2514 345/500 [===================>..........] 
- ETA: 51s - loss: 1.6277 - regression_loss: 1.3763 - classification_loss: 0.2514 346/500 [===================>..........] - ETA: 50s - loss: 1.6282 - regression_loss: 1.3767 - classification_loss: 0.2515 347/500 [===================>..........] - ETA: 50s - loss: 1.6295 - regression_loss: 1.3779 - classification_loss: 0.2516 348/500 [===================>..........] - ETA: 50s - loss: 1.6292 - regression_loss: 1.3776 - classification_loss: 0.2516 349/500 [===================>..........] - ETA: 49s - loss: 1.6289 - regression_loss: 1.3773 - classification_loss: 0.2516 350/500 [====================>.........] - ETA: 49s - loss: 1.6308 - regression_loss: 1.3788 - classification_loss: 0.2520 351/500 [====================>.........] - ETA: 49s - loss: 1.6314 - regression_loss: 1.3794 - classification_loss: 0.2520 352/500 [====================>.........] - ETA: 48s - loss: 1.6323 - regression_loss: 1.3802 - classification_loss: 0.2522 353/500 [====================>.........] - ETA: 48s - loss: 1.6323 - regression_loss: 1.3802 - classification_loss: 0.2522 354/500 [====================>.........] - ETA: 48s - loss: 1.6338 - regression_loss: 1.3814 - classification_loss: 0.2524 355/500 [====================>.........] - ETA: 47s - loss: 1.6341 - regression_loss: 1.3817 - classification_loss: 0.2524 356/500 [====================>.........] - ETA: 47s - loss: 1.6339 - regression_loss: 1.3815 - classification_loss: 0.2524 357/500 [====================>.........] - ETA: 47s - loss: 1.6335 - regression_loss: 1.3811 - classification_loss: 0.2524 358/500 [====================>.........] - ETA: 46s - loss: 1.6339 - regression_loss: 1.3813 - classification_loss: 0.2526 359/500 [====================>.........] - ETA: 46s - loss: 1.6321 - regression_loss: 1.3799 - classification_loss: 0.2522 360/500 [====================>.........] - ETA: 46s - loss: 1.6318 - regression_loss: 1.3796 - classification_loss: 0.2522 361/500 [====================>.........] 
- ETA: 45s - loss: 1.6333 - regression_loss: 1.3808 - classification_loss: 0.2524 362/500 [====================>.........] - ETA: 45s - loss: 1.6332 - regression_loss: 1.3807 - classification_loss: 0.2524 363/500 [====================>.........] - ETA: 45s - loss: 1.6330 - regression_loss: 1.3808 - classification_loss: 0.2523 364/500 [====================>.........] - ETA: 44s - loss: 1.6314 - regression_loss: 1.3793 - classification_loss: 0.2521 365/500 [====================>.........] - ETA: 44s - loss: 1.6310 - regression_loss: 1.3789 - classification_loss: 0.2521 366/500 [====================>.........] - ETA: 44s - loss: 1.6313 - regression_loss: 1.3792 - classification_loss: 0.2521 367/500 [=====================>........] - ETA: 43s - loss: 1.6319 - regression_loss: 1.3798 - classification_loss: 0.2521 368/500 [=====================>........] - ETA: 43s - loss: 1.6327 - regression_loss: 1.3806 - classification_loss: 0.2521 369/500 [=====================>........] - ETA: 43s - loss: 1.6311 - regression_loss: 1.3793 - classification_loss: 0.2518 370/500 [=====================>........] - ETA: 42s - loss: 1.6312 - regression_loss: 1.3793 - classification_loss: 0.2518 371/500 [=====================>........] - ETA: 42s - loss: 1.6307 - regression_loss: 1.3790 - classification_loss: 0.2517 372/500 [=====================>........] - ETA: 42s - loss: 1.6316 - regression_loss: 1.3796 - classification_loss: 0.2519 373/500 [=====================>........] - ETA: 41s - loss: 1.6322 - regression_loss: 1.3801 - classification_loss: 0.2521 374/500 [=====================>........] - ETA: 41s - loss: 1.6321 - regression_loss: 1.3800 - classification_loss: 0.2521 375/500 [=====================>........] - ETA: 41s - loss: 1.6300 - regression_loss: 1.3780 - classification_loss: 0.2520 376/500 [=====================>........] - ETA: 40s - loss: 1.6293 - regression_loss: 1.3773 - classification_loss: 0.2520 377/500 [=====================>........] 
- ETA: 40s - loss: 1.6270 - regression_loss: 1.3753 - classification_loss: 0.2517 378/500 [=====================>........] - ETA: 40s - loss: 1.6258 - regression_loss: 1.3742 - classification_loss: 0.2515 379/500 [=====================>........] - ETA: 39s - loss: 1.6253 - regression_loss: 1.3737 - classification_loss: 0.2516 380/500 [=====================>........] - ETA: 39s - loss: 1.6259 - regression_loss: 1.3743 - classification_loss: 0.2516 381/500 [=====================>........] - ETA: 39s - loss: 1.6242 - regression_loss: 1.3730 - classification_loss: 0.2512 382/500 [=====================>........] - ETA: 38s - loss: 1.6245 - regression_loss: 1.3732 - classification_loss: 0.2513 383/500 [=====================>........] - ETA: 38s - loss: 1.6255 - regression_loss: 1.3741 - classification_loss: 0.2514 384/500 [======================>.......] - ETA: 38s - loss: 1.6261 - regression_loss: 1.3746 - classification_loss: 0.2515 385/500 [======================>.......] - ETA: 38s - loss: 1.6262 - regression_loss: 1.3748 - classification_loss: 0.2515 386/500 [======================>.......] - ETA: 37s - loss: 1.6254 - regression_loss: 1.3742 - classification_loss: 0.2512 387/500 [======================>.......] - ETA: 37s - loss: 1.6257 - regression_loss: 1.3745 - classification_loss: 0.2512 388/500 [======================>.......] - ETA: 37s - loss: 1.6263 - regression_loss: 1.3749 - classification_loss: 0.2514 389/500 [======================>.......] - ETA: 36s - loss: 1.6271 - regression_loss: 1.3756 - classification_loss: 0.2515 390/500 [======================>.......] - ETA: 36s - loss: 1.6277 - regression_loss: 1.3762 - classification_loss: 0.2515 391/500 [======================>.......] - ETA: 36s - loss: 1.6269 - regression_loss: 1.3756 - classification_loss: 0.2513 392/500 [======================>.......] - ETA: 35s - loss: 1.6269 - regression_loss: 1.3756 - classification_loss: 0.2513 393/500 [======================>.......] 
- ETA: 35s - loss: 1.6272 - regression_loss: 1.3759 - classification_loss: 0.2514 394/500 [======================>.......] - ETA: 35s - loss: 1.6275 - regression_loss: 1.3761 - classification_loss: 0.2514 395/500 [======================>.......] - ETA: 34s - loss: 1.6283 - regression_loss: 1.3768 - classification_loss: 0.2514 396/500 [======================>.......] - ETA: 34s - loss: 1.6291 - regression_loss: 1.3776 - classification_loss: 0.2515 397/500 [======================>.......] - ETA: 34s - loss: 1.6278 - regression_loss: 1.3764 - classification_loss: 0.2514 398/500 [======================>.......] - ETA: 33s - loss: 1.6285 - regression_loss: 1.3772 - classification_loss: 0.2514 399/500 [======================>.......] - ETA: 33s - loss: 1.6290 - regression_loss: 1.3776 - classification_loss: 0.2514 400/500 [=======================>......] - ETA: 33s - loss: 1.6298 - regression_loss: 1.3785 - classification_loss: 0.2513 401/500 [=======================>......] - ETA: 32s - loss: 1.6273 - regression_loss: 1.3763 - classification_loss: 0.2510 402/500 [=======================>......] - ETA: 32s - loss: 1.6275 - regression_loss: 1.3765 - classification_loss: 0.2510 403/500 [=======================>......] - ETA: 32s - loss: 1.6280 - regression_loss: 1.3770 - classification_loss: 0.2510 404/500 [=======================>......] - ETA: 31s - loss: 1.6281 - regression_loss: 1.3771 - classification_loss: 0.2510 405/500 [=======================>......] - ETA: 31s - loss: 1.6272 - regression_loss: 1.3764 - classification_loss: 0.2507 406/500 [=======================>......] - ETA: 31s - loss: 1.6257 - regression_loss: 1.3752 - classification_loss: 0.2504 407/500 [=======================>......] - ETA: 30s - loss: 1.6260 - regression_loss: 1.3756 - classification_loss: 0.2504 408/500 [=======================>......] - ETA: 30s - loss: 1.6262 - regression_loss: 1.3759 - classification_loss: 0.2503 409/500 [=======================>......] 
- ETA: 30s - loss: 1.6272 - regression_loss: 1.3767 - classification_loss: 0.2505 410/500 [=======================>......] - ETA: 29s - loss: 1.6254 - regression_loss: 1.3751 - classification_loss: 0.2503 411/500 [=======================>......] - ETA: 29s - loss: 1.6254 - regression_loss: 1.3750 - classification_loss: 0.2504 412/500 [=======================>......] - ETA: 29s - loss: 1.6250 - regression_loss: 1.3749 - classification_loss: 0.2501 413/500 [=======================>......] - ETA: 28s - loss: 1.6257 - regression_loss: 1.3755 - classification_loss: 0.2502 414/500 [=======================>......] - ETA: 28s - loss: 1.6270 - regression_loss: 1.3768 - classification_loss: 0.2503 415/500 [=======================>......] - ETA: 28s - loss: 1.6289 - regression_loss: 1.3781 - classification_loss: 0.2508 416/500 [=======================>......] - ETA: 27s - loss: 1.6298 - regression_loss: 1.3788 - classification_loss: 0.2510 417/500 [========================>.....] - ETA: 27s - loss: 1.6293 - regression_loss: 1.3785 - classification_loss: 0.2508 418/500 [========================>.....] - ETA: 27s - loss: 1.6273 - regression_loss: 1.3768 - classification_loss: 0.2505 419/500 [========================>.....] - ETA: 26s - loss: 1.6289 - regression_loss: 1.3781 - classification_loss: 0.2508 420/500 [========================>.....] - ETA: 26s - loss: 1.6298 - regression_loss: 1.3788 - classification_loss: 0.2509 421/500 [========================>.....] - ETA: 26s - loss: 1.6299 - regression_loss: 1.3789 - classification_loss: 0.2510 422/500 [========================>.....] - ETA: 25s - loss: 1.6296 - regression_loss: 1.3787 - classification_loss: 0.2509 423/500 [========================>.....] - ETA: 25s - loss: 1.6299 - regression_loss: 1.3790 - classification_loss: 0.2509 424/500 [========================>.....] - ETA: 25s - loss: 1.6307 - regression_loss: 1.3798 - classification_loss: 0.2510 425/500 [========================>.....] 
- ETA: 24s - loss: 1.6317 - regression_loss: 1.3805 - classification_loss: 0.2513 426/500 [========================>.....] - ETA: 24s - loss: 1.6309 - regression_loss: 1.3799 - classification_loss: 0.2511 427/500 [========================>.....] - ETA: 24s - loss: 1.6295 - regression_loss: 1.3787 - classification_loss: 0.2508 428/500 [========================>.....] - ETA: 23s - loss: 1.6303 - regression_loss: 1.3792 - classification_loss: 0.2511 429/500 [========================>.....] - ETA: 23s - loss: 1.6313 - regression_loss: 1.3801 - classification_loss: 0.2512 430/500 [========================>.....] - ETA: 23s - loss: 1.6314 - regression_loss: 1.3802 - classification_loss: 0.2512 431/500 [========================>.....] - ETA: 22s - loss: 1.6288 - regression_loss: 1.3780 - classification_loss: 0.2508 432/500 [========================>.....] - ETA: 22s - loss: 1.6282 - regression_loss: 1.3776 - classification_loss: 0.2506 433/500 [========================>.....] - ETA: 22s - loss: 1.6299 - regression_loss: 1.3790 - classification_loss: 0.2509 434/500 [=========================>....] - ETA: 21s - loss: 1.6289 - regression_loss: 1.3782 - classification_loss: 0.2507 435/500 [=========================>....] - ETA: 21s - loss: 1.6294 - regression_loss: 1.3787 - classification_loss: 0.2507 436/500 [=========================>....] - ETA: 21s - loss: 1.6287 - regression_loss: 1.3780 - classification_loss: 0.2507 437/500 [=========================>....] - ETA: 20s - loss: 1.6277 - regression_loss: 1.3772 - classification_loss: 0.2505 438/500 [=========================>....] - ETA: 20s - loss: 1.6277 - regression_loss: 1.3771 - classification_loss: 0.2505 439/500 [=========================>....] - ETA: 20s - loss: 1.6277 - regression_loss: 1.3772 - classification_loss: 0.2505 440/500 [=========================>....] - ETA: 19s - loss: 1.6281 - regression_loss: 1.3775 - classification_loss: 0.2505 441/500 [=========================>....] 
- ETA: 19s - loss: 1.6272 - regression_loss: 1.3769 - classification_loss: 0.2504 442/500 [=========================>....] - ETA: 19s - loss: 1.6274 - regression_loss: 1.3771 - classification_loss: 0.2503 443/500 [=========================>....] - ETA: 18s - loss: 1.6268 - regression_loss: 1.3766 - classification_loss: 0.2502 444/500 [=========================>....] - ETA: 18s - loss: 1.6277 - regression_loss: 1.3774 - classification_loss: 0.2503 445/500 [=========================>....] - ETA: 18s - loss: 1.6293 - regression_loss: 1.3788 - classification_loss: 0.2505 446/500 [=========================>....] - ETA: 17s - loss: 1.6292 - regression_loss: 1.3788 - classification_loss: 0.2505 447/500 [=========================>....] - ETA: 17s - loss: 1.6295 - regression_loss: 1.3790 - classification_loss: 0.2505 448/500 [=========================>....] - ETA: 17s - loss: 1.6297 - regression_loss: 1.3792 - classification_loss: 0.2505 449/500 [=========================>....] - ETA: 16s - loss: 1.6296 - regression_loss: 1.3792 - classification_loss: 0.2504 450/500 [==========================>...] - ETA: 16s - loss: 1.6302 - regression_loss: 1.3797 - classification_loss: 0.2505 451/500 [==========================>...] - ETA: 16s - loss: 1.6314 - regression_loss: 1.3808 - classification_loss: 0.2506 452/500 [==========================>...] - ETA: 15s - loss: 1.6322 - regression_loss: 1.3816 - classification_loss: 0.2507 453/500 [==========================>...] - ETA: 15s - loss: 1.6322 - regression_loss: 1.3817 - classification_loss: 0.2506 454/500 [==========================>...] - ETA: 15s - loss: 1.6313 - regression_loss: 1.3810 - classification_loss: 0.2503 455/500 [==========================>...] - ETA: 14s - loss: 1.6299 - regression_loss: 1.3797 - classification_loss: 0.2502 456/500 [==========================>...] - ETA: 14s - loss: 1.6302 - regression_loss: 1.3800 - classification_loss: 0.2502 457/500 [==========================>...] 
- ETA: 14s - loss: 1.6310 - regression_loss: 1.3807 - classification_loss: 0.2503 458/500 [==========================>...] - ETA: 13s - loss: 1.6295 - regression_loss: 1.3794 - classification_loss: 0.2501 459/500 [==========================>...] - ETA: 13s - loss: 1.6299 - regression_loss: 1.3798 - classification_loss: 0.2501 460/500 [==========================>...] - ETA: 13s - loss: 1.6298 - regression_loss: 1.3797 - classification_loss: 0.2501 461/500 [==========================>...] - ETA: 12s - loss: 1.6298 - regression_loss: 1.3797 - classification_loss: 0.2501 462/500 [==========================>...] - ETA: 12s - loss: 1.6290 - regression_loss: 1.3791 - classification_loss: 0.2499 463/500 [==========================>...] - ETA: 12s - loss: 1.6290 - regression_loss: 1.3793 - classification_loss: 0.2498 464/500 [==========================>...] - ETA: 11s - loss: 1.6293 - regression_loss: 1.3797 - classification_loss: 0.2496 465/500 [==========================>...] - ETA: 11s - loss: 1.6305 - regression_loss: 1.3806 - classification_loss: 0.2498 466/500 [==========================>...] - ETA: 11s - loss: 1.6294 - regression_loss: 1.3798 - classification_loss: 0.2496 467/500 [===========================>..] - ETA: 10s - loss: 1.6301 - regression_loss: 1.3804 - classification_loss: 0.2497 468/500 [===========================>..] - ETA: 10s - loss: 1.6282 - regression_loss: 1.3788 - classification_loss: 0.2495 469/500 [===========================>..] - ETA: 10s - loss: 1.6289 - regression_loss: 1.3793 - classification_loss: 0.2496 470/500 [===========================>..] - ETA: 9s - loss: 1.6270 - regression_loss: 1.3777 - classification_loss: 0.2493  471/500 [===========================>..] - ETA: 9s - loss: 1.6271 - regression_loss: 1.3779 - classification_loss: 0.2493 472/500 [===========================>..] - ETA: 9s - loss: 1.6270 - regression_loss: 1.3778 - classification_loss: 0.2492 473/500 [===========================>..] 
- ETA: 8s - loss: 1.6262 - regression_loss: 1.3772 - classification_loss: 0.2490 474/500 [===========================>..] - ETA: 8s - loss: 1.6274 - regression_loss: 1.3782 - classification_loss: 0.2492 475/500 [===========================>..] - ETA: 8s - loss: 1.6272 - regression_loss: 1.3781 - classification_loss: 0.2491 476/500 [===========================>..] - ETA: 7s - loss: 1.6277 - regression_loss: 1.3785 - classification_loss: 0.2492 477/500 [===========================>..] - ETA: 7s - loss: 1.6274 - regression_loss: 1.3782 - classification_loss: 0.2492 478/500 [===========================>..] - ETA: 7s - loss: 1.6282 - regression_loss: 1.3789 - classification_loss: 0.2493 479/500 [===========================>..] - ETA: 6s - loss: 1.6261 - regression_loss: 1.3770 - classification_loss: 0.2491 480/500 [===========================>..] - ETA: 6s - loss: 1.6256 - regression_loss: 1.3768 - classification_loss: 0.2489 481/500 [===========================>..] - ETA: 6s - loss: 1.6277 - regression_loss: 1.3786 - classification_loss: 0.2491 482/500 [===========================>..] - ETA: 5s - loss: 1.6285 - regression_loss: 1.3793 - classification_loss: 0.2493 483/500 [===========================>..] - ETA: 5s - loss: 1.6289 - regression_loss: 1.3797 - classification_loss: 0.2492 484/500 [============================>.] - ETA: 5s - loss: 1.6303 - regression_loss: 1.3810 - classification_loss: 0.2494 485/500 [============================>.] - ETA: 4s - loss: 1.6300 - regression_loss: 1.3806 - classification_loss: 0.2494 486/500 [============================>.] - ETA: 4s - loss: 1.6306 - regression_loss: 1.3812 - classification_loss: 0.2494 487/500 [============================>.] - ETA: 4s - loss: 1.6305 - regression_loss: 1.3812 - classification_loss: 0.2494 488/500 [============================>.] - ETA: 3s - loss: 1.6303 - regression_loss: 1.3809 - classification_loss: 0.2494 489/500 [============================>.] 
- ETA: 3s - loss: 1.6305 - regression_loss: 1.3813 - classification_loss: 0.2492 490/500 [============================>.] - ETA: 3s - loss: 1.6285 - regression_loss: 1.3796 - classification_loss: 0.2489 491/500 [============================>.] - ETA: 2s - loss: 1.6297 - regression_loss: 1.3805 - classification_loss: 0.2492 492/500 [============================>.] - ETA: 2s - loss: 1.6288 - regression_loss: 1.3798 - classification_loss: 0.2490 493/500 [============================>.] - ETA: 2s - loss: 1.6292 - regression_loss: 1.3800 - classification_loss: 0.2492 494/500 [============================>.] - ETA: 1s - loss: 1.6299 - regression_loss: 1.3807 - classification_loss: 0.2493 495/500 [============================>.] - ETA: 1s - loss: 1.6304 - regression_loss: 1.3810 - classification_loss: 0.2494 496/500 [============================>.] - ETA: 1s - loss: 1.6302 - regression_loss: 1.3809 - classification_loss: 0.2493 497/500 [============================>.] - ETA: 0s - loss: 1.6287 - regression_loss: 1.3795 - classification_loss: 0.2492 498/500 [============================>.] - ETA: 0s - loss: 1.6264 - regression_loss: 1.3775 - classification_loss: 0.2489 499/500 [============================>.] - ETA: 0s - loss: 1.6270 - regression_loss: 1.3779 - classification_loss: 0.2491 500/500 [==============================] - 165s 331ms/step - loss: 1.6278 - regression_loss: 1.3785 - classification_loss: 0.2492 1172 instances of class plum with average precision: 0.5086 mAP: 0.5086 Epoch 00014: saving model to ./training/snapshots/resnet101_pascal_14.h5 Epoch 15/150 1/500 [..............................] - ETA: 2:51 - loss: 1.0447 - regression_loss: 0.9116 - classification_loss: 0.1331 2/500 [..............................] - ETA: 2:47 - loss: 1.4496 - regression_loss: 1.2353 - classification_loss: 0.2143 3/500 [..............................] - ETA: 2:48 - loss: 1.8013 - regression_loss: 1.5168 - classification_loss: 0.2846 4/500 [..............................] 
- ETA: 2:46 - loss: 1.6964 - regression_loss: 1.4238 - classification_loss: 0.2725 5/500 [..............................] - ETA: 2:44 - loss: 1.7913 - regression_loss: 1.5120 - classification_loss: 0.2794 6/500 [..............................] - ETA: 2:42 - loss: 1.6495 - regression_loss: 1.3795 - classification_loss: 0.2700 7/500 [..............................] - ETA: 2:42 - loss: 1.5555 - regression_loss: 1.2991 - classification_loss: 0.2564 8/500 [..............................] - ETA: 2:43 - loss: 1.5033 - regression_loss: 1.2525 - classification_loss: 0.2507 9/500 [..............................] - ETA: 2:43 - loss: 1.6003 - regression_loss: 1.3313 - classification_loss: 0.2690 10/500 [..............................] - ETA: 2:43 - loss: 1.6237 - regression_loss: 1.3530 - classification_loss: 0.2708 11/500 [..............................] - ETA: 2:43 - loss: 1.6679 - regression_loss: 1.3938 - classification_loss: 0.2741 12/500 [..............................] - ETA: 2:42 - loss: 1.6634 - regression_loss: 1.3890 - classification_loss: 0.2744 13/500 [..............................] - ETA: 2:42 - loss: 1.6603 - regression_loss: 1.3893 - classification_loss: 0.2709 14/500 [..............................] - ETA: 2:41 - loss: 1.6782 - regression_loss: 1.4063 - classification_loss: 0.2720 15/500 [..............................] - ETA: 2:40 - loss: 1.6803 - regression_loss: 1.4074 - classification_loss: 0.2728 16/500 [..............................] - ETA: 2:40 - loss: 1.6980 - regression_loss: 1.4252 - classification_loss: 0.2728 17/500 [>.............................] - ETA: 2:40 - loss: 1.6904 - regression_loss: 1.4183 - classification_loss: 0.2721 18/500 [>.............................] - ETA: 2:39 - loss: 1.6622 - regression_loss: 1.3942 - classification_loss: 0.2681 19/500 [>.............................] - ETA: 2:39 - loss: 1.6393 - regression_loss: 1.3774 - classification_loss: 0.2620 20/500 [>.............................] 
- ETA: 2:39 - loss: 1.6355 - regression_loss: 1.3728 - classification_loss: 0.2626 21/500 [>.............................] - ETA: 2:38 - loss: 1.6197 - regression_loss: 1.3607 - classification_loss: 0.2590 22/500 [>.............................] - ETA: 2:38 - loss: 1.5817 - regression_loss: 1.3257 - classification_loss: 0.2560 23/500 [>.............................] - ETA: 2:38 - loss: 1.5586 - regression_loss: 1.3081 - classification_loss: 0.2505 24/500 [>.............................] - ETA: 2:37 - loss: 1.5781 - regression_loss: 1.3251 - classification_loss: 0.2530 25/500 [>.............................] - ETA: 2:37 - loss: 1.5984 - regression_loss: 1.3411 - classification_loss: 0.2573 26/500 [>.............................] - ETA: 2:37 - loss: 1.6033 - regression_loss: 1.3475 - classification_loss: 0.2558 27/500 [>.............................] - ETA: 2:36 - loss: 1.6146 - regression_loss: 1.3569 - classification_loss: 0.2577 28/500 [>.............................] - ETA: 2:36 - loss: 1.6111 - regression_loss: 1.3556 - classification_loss: 0.2555 29/500 [>.............................] - ETA: 2:35 - loss: 1.6187 - regression_loss: 1.3631 - classification_loss: 0.2556 30/500 [>.............................] - ETA: 2:34 - loss: 1.6204 - regression_loss: 1.3677 - classification_loss: 0.2527 31/500 [>.............................] - ETA: 2:34 - loss: 1.6411 - regression_loss: 1.3867 - classification_loss: 0.2544 32/500 [>.............................] - ETA: 2:34 - loss: 1.6305 - regression_loss: 1.3790 - classification_loss: 0.2515 33/500 [>.............................] - ETA: 2:33 - loss: 1.6193 - regression_loss: 1.3715 - classification_loss: 0.2478 34/500 [=>............................] - ETA: 2:33 - loss: 1.6318 - regression_loss: 1.3823 - classification_loss: 0.2495 35/500 [=>............................] - ETA: 2:33 - loss: 1.6441 - regression_loss: 1.3921 - classification_loss: 0.2520 36/500 [=>............................] 
- ETA: 2:32 - loss: 1.6165 - regression_loss: 1.3648 - classification_loss: 0.2517 37/500 [=>............................] - ETA: 2:32 - loss: 1.6155 - regression_loss: 1.3618 - classification_loss: 0.2537 38/500 [=>............................] - ETA: 2:31 - loss: 1.6244 - regression_loss: 1.3704 - classification_loss: 0.2540 39/500 [=>............................] - ETA: 2:31 - loss: 1.6154 - regression_loss: 1.3633 - classification_loss: 0.2521 40/500 [=>............................] - ETA: 2:30 - loss: 1.6177 - regression_loss: 1.3641 - classification_loss: 0.2537 41/500 [=>............................] - ETA: 2:30 - loss: 1.6140 - regression_loss: 1.3620 - classification_loss: 0.2520 42/500 [=>............................] - ETA: 2:30 - loss: 1.6160 - regression_loss: 1.3633 - classification_loss: 0.2527 43/500 [=>............................] - ETA: 2:30 - loss: 1.5985 - regression_loss: 1.3485 - classification_loss: 0.2500 44/500 [=>............................] - ETA: 2:29 - loss: 1.5823 - regression_loss: 1.3349 - classification_loss: 0.2474 45/500 [=>............................] - ETA: 2:29 - loss: 1.5576 - regression_loss: 1.3134 - classification_loss: 0.2442 46/500 [=>............................] - ETA: 2:29 - loss: 1.5724 - regression_loss: 1.3260 - classification_loss: 0.2464 47/500 [=>............................] - ETA: 2:28 - loss: 1.5890 - regression_loss: 1.3391 - classification_loss: 0.2499 48/500 [=>............................] - ETA: 2:28 - loss: 1.5975 - regression_loss: 1.3458 - classification_loss: 0.2518 49/500 [=>............................] - ETA: 2:28 - loss: 1.6024 - regression_loss: 1.3509 - classification_loss: 0.2515 50/500 [==>...........................] - ETA: 2:27 - loss: 1.6069 - regression_loss: 1.3554 - classification_loss: 0.2515 51/500 [==>...........................] - ETA: 2:27 - loss: 1.6081 - regression_loss: 1.3567 - classification_loss: 0.2513 52/500 [==>...........................] 
- ETA: 2:27 - loss: 1.6161 - regression_loss: 1.3646 - classification_loss: 0.2515 53/500 [==>...........................] - ETA: 2:26 - loss: 1.6147 - regression_loss: 1.3631 - classification_loss: 0.2516 54/500 [==>...........................] - ETA: 2:26 - loss: 1.6145 - regression_loss: 1.3632 - classification_loss: 0.2512 55/500 [==>...........................] - ETA: 2:26 - loss: 1.6026 - regression_loss: 1.3536 - classification_loss: 0.2491 56/500 [==>...........................] - ETA: 2:26 - loss: 1.6050 - regression_loss: 1.3552 - classification_loss: 0.2499 57/500 [==>...........................] - ETA: 2:25 - loss: 1.6093 - regression_loss: 1.3599 - classification_loss: 0.2494 58/500 [==>...........................] - ETA: 2:25 - loss: 1.6132 - regression_loss: 1.3638 - classification_loss: 0.2494 59/500 [==>...........................] - ETA: 2:24 - loss: 1.6298 - regression_loss: 1.3775 - classification_loss: 0.2523 60/500 [==>...........................] - ETA: 2:24 - loss: 1.6361 - regression_loss: 1.3825 - classification_loss: 0.2535 61/500 [==>...........................] - ETA: 2:24 - loss: 1.6347 - regression_loss: 1.3816 - classification_loss: 0.2531 62/500 [==>...........................] - ETA: 2:24 - loss: 1.6452 - regression_loss: 1.3902 - classification_loss: 0.2549 63/500 [==>...........................] - ETA: 2:24 - loss: 1.6380 - regression_loss: 1.3844 - classification_loss: 0.2536 64/500 [==>...........................] - ETA: 2:23 - loss: 1.6370 - regression_loss: 1.3836 - classification_loss: 0.2534 65/500 [==>...........................] - ETA: 2:23 - loss: 1.6407 - regression_loss: 1.3876 - classification_loss: 0.2531 66/500 [==>...........................] - ETA: 2:23 - loss: 1.6368 - regression_loss: 1.3849 - classification_loss: 0.2519 67/500 [===>..........................] - ETA: 2:23 - loss: 1.6399 - regression_loss: 1.3876 - classification_loss: 0.2523 68/500 [===>..........................] 
- ETA: 2:23 - loss: 1.6440 - regression_loss: 1.3899 - classification_loss: 0.2542 69/500 [===>..........................] - ETA: 2:22 - loss: 1.6464 - regression_loss: 1.3928 - classification_loss: 0.2536 70/500 [===>..........................] - ETA: 2:22 - loss: 1.6419 - regression_loss: 1.3892 - classification_loss: 0.2526 71/500 [===>..........................] - ETA: 2:22 - loss: 1.6460 - regression_loss: 1.3931 - classification_loss: 0.2529 72/500 [===>..........................] - ETA: 2:21 - loss: 1.6622 - regression_loss: 1.4068 - classification_loss: 0.2554 73/500 [===>..........................] - ETA: 2:21 - loss: 1.6622 - regression_loss: 1.4070 - classification_loss: 0.2551 74/500 [===>..........................] - ETA: 2:20 - loss: 1.6597 - regression_loss: 1.4047 - classification_loss: 0.2549 75/500 [===>..........................] - ETA: 2:20 - loss: 1.6743 - regression_loss: 1.4168 - classification_loss: 0.2575 76/500 [===>..........................] - ETA: 2:20 - loss: 1.6798 - regression_loss: 1.4219 - classification_loss: 0.2579 77/500 [===>..........................] - ETA: 2:20 - loss: 1.6779 - regression_loss: 1.4204 - classification_loss: 0.2575 78/500 [===>..........................] - ETA: 2:19 - loss: 1.6773 - regression_loss: 1.4203 - classification_loss: 0.2570 79/500 [===>..........................] - ETA: 2:19 - loss: 1.6742 - regression_loss: 1.4161 - classification_loss: 0.2581 80/500 [===>..........................] - ETA: 2:19 - loss: 1.6797 - regression_loss: 1.4211 - classification_loss: 0.2586 81/500 [===>..........................] - ETA: 2:18 - loss: 1.6851 - regression_loss: 1.4260 - classification_loss: 0.2591 82/500 [===>..........................] - ETA: 2:18 - loss: 1.6748 - regression_loss: 1.4172 - classification_loss: 0.2576 83/500 [===>..........................] - ETA: 2:18 - loss: 1.6742 - regression_loss: 1.4167 - classification_loss: 0.2575 84/500 [====>.........................] 
- ETA: 2:17 - loss: 1.6761 - regression_loss: 1.4181 - classification_loss: 0.2580 85/500 [====>.........................] - ETA: 2:17 - loss: 1.6725 - regression_loss: 1.4152 - classification_loss: 0.2573 86/500 [====>.........................] - ETA: 2:17 - loss: 1.6739 - regression_loss: 1.4164 - classification_loss: 0.2575 87/500 [====>.........................] - ETA: 2:16 - loss: 1.6700 - regression_loss: 1.4130 - classification_loss: 0.2570 88/500 [====>.........................] - ETA: 2:16 - loss: 1.6769 - regression_loss: 1.4185 - classification_loss: 0.2584 89/500 [====>.........................] - ETA: 2:16 - loss: 1.6778 - regression_loss: 1.4196 - classification_loss: 0.2582 90/500 [====>.........................] - ETA: 2:15 - loss: 1.6840 - regression_loss: 1.4246 - classification_loss: 0.2593 91/500 [====>.........................] - ETA: 2:15 - loss: 1.6700 - regression_loss: 1.4125 - classification_loss: 0.2575 92/500 [====>.........................] - ETA: 2:15 - loss: 1.6642 - regression_loss: 1.4076 - classification_loss: 0.2567 93/500 [====>.........................] - ETA: 2:14 - loss: 1.6563 - regression_loss: 1.4008 - classification_loss: 0.2555 94/500 [====>.........................] - ETA: 2:14 - loss: 1.6590 - regression_loss: 1.4037 - classification_loss: 0.2553 95/500 [====>.........................] - ETA: 2:14 - loss: 1.6577 - regression_loss: 1.4027 - classification_loss: 0.2550 96/500 [====>.........................] - ETA: 2:13 - loss: 1.6624 - regression_loss: 1.4062 - classification_loss: 0.2562 97/500 [====>.........................] - ETA: 2:13 - loss: 1.6665 - regression_loss: 1.4093 - classification_loss: 0.2572 98/500 [====>.........................] - ETA: 2:13 - loss: 1.6705 - regression_loss: 1.4132 - classification_loss: 0.2573 99/500 [====>.........................] - ETA: 2:12 - loss: 1.6736 - regression_loss: 1.4166 - classification_loss: 0.2570 100/500 [=====>........................] 
- ETA: 2:12 - loss: 1.6762 - regression_loss: 1.4189 - classification_loss: 0.2573 101/500 [=====>........................] - ETA: 2:12 - loss: 1.6710 - regression_loss: 1.4146 - classification_loss: 0.2564 102/500 [=====>........................] - ETA: 2:11 - loss: 1.6759 - regression_loss: 1.4187 - classification_loss: 0.2572 103/500 [=====>........................] - ETA: 2:11 - loss: 1.6788 - regression_loss: 1.4215 - classification_loss: 0.2573 104/500 [=====>........................] - ETA: 2:11 - loss: 1.6827 - regression_loss: 1.4248 - classification_loss: 0.2579 105/500 [=====>........................] - ETA: 2:10 - loss: 1.6836 - regression_loss: 1.4257 - classification_loss: 0.2579 106/500 [=====>........................] - ETA: 2:10 - loss: 1.6807 - regression_loss: 1.4237 - classification_loss: 0.2570 107/500 [=====>........................] - ETA: 2:09 - loss: 1.6846 - regression_loss: 1.4271 - classification_loss: 0.2575 108/500 [=====>........................] - ETA: 2:09 - loss: 1.6837 - regression_loss: 1.4261 - classification_loss: 0.2577 109/500 [=====>........................] - ETA: 2:09 - loss: 1.6847 - regression_loss: 1.4267 - classification_loss: 0.2580 110/500 [=====>........................] - ETA: 2:08 - loss: 1.6752 - regression_loss: 1.4184 - classification_loss: 0.2569 111/500 [=====>........................] - ETA: 2:08 - loss: 1.6737 - regression_loss: 1.4168 - classification_loss: 0.2569 112/500 [=====>........................] - ETA: 2:08 - loss: 1.6654 - regression_loss: 1.4089 - classification_loss: 0.2565 113/500 [=====>........................] - ETA: 2:08 - loss: 1.6609 - regression_loss: 1.4049 - classification_loss: 0.2560 114/500 [=====>........................] - ETA: 2:07 - loss: 1.6625 - regression_loss: 1.4066 - classification_loss: 0.2559 115/500 [=====>........................] - ETA: 2:07 - loss: 1.6622 - regression_loss: 1.4068 - classification_loss: 0.2554 116/500 [=====>........................] 
- ETA: 2:07 - loss: 1.6612 - regression_loss: 1.4056 - classification_loss: 0.2556 117/500 [======>.......................] - ETA: 2:06 - loss: 1.6622 - regression_loss: 1.4065 - classification_loss: 0.2556 118/500 [======>.......................] - ETA: 2:06 - loss: 1.6583 - regression_loss: 1.4033 - classification_loss: 0.2550 119/500 [======>.......................] - ETA: 2:06 - loss: 1.6596 - regression_loss: 1.4043 - classification_loss: 0.2553 120/500 [======>.......................] - ETA: 2:05 - loss: 1.6618 - regression_loss: 1.4071 - classification_loss: 0.2547 121/500 [======>.......................] - ETA: 2:05 - loss: 1.6604 - regression_loss: 1.4058 - classification_loss: 0.2546 122/500 [======>.......................] - ETA: 2:05 - loss: 1.6562 - regression_loss: 1.4024 - classification_loss: 0.2538 123/500 [======>.......................] - ETA: 2:04 - loss: 1.6556 - regression_loss: 1.4020 - classification_loss: 0.2536 124/500 [======>.......................] - ETA: 2:04 - loss: 1.6481 - regression_loss: 1.3955 - classification_loss: 0.2526 125/500 [======>.......................] - ETA: 2:04 - loss: 1.6491 - regression_loss: 1.3963 - classification_loss: 0.2527 126/500 [======>.......................] - ETA: 2:03 - loss: 1.6503 - regression_loss: 1.3974 - classification_loss: 0.2529 127/500 [======>.......................] - ETA: 2:03 - loss: 1.6494 - regression_loss: 1.3965 - classification_loss: 0.2529 128/500 [======>.......................] - ETA: 2:03 - loss: 1.6503 - regression_loss: 1.3973 - classification_loss: 0.2529 129/500 [======>.......................] - ETA: 2:02 - loss: 1.6473 - regression_loss: 1.3949 - classification_loss: 0.2524 130/500 [======>.......................] - ETA: 2:02 - loss: 1.6470 - regression_loss: 1.3941 - classification_loss: 0.2528 131/500 [======>.......................] - ETA: 2:02 - loss: 1.6496 - regression_loss: 1.3964 - classification_loss: 0.2532 132/500 [======>.......................] 
- ETA: 2:01 - loss: 1.6492 - regression_loss: 1.3958 - classification_loss: 0.2534 133/500 [======>.......................] - ETA: 2:01 - loss: 1.6453 - regression_loss: 1.3925 - classification_loss: 0.2528 134/500 [=======>......................] - ETA: 2:01 - loss: 1.6420 - regression_loss: 1.3897 - classification_loss: 0.2523 135/500 [=======>......................] - ETA: 2:01 - loss: 1.6426 - regression_loss: 1.3901 - classification_loss: 0.2525 136/500 [=======>......................] - ETA: 2:00 - loss: 1.6346 - regression_loss: 1.3831 - classification_loss: 0.2515 137/500 [=======>......................] - ETA: 2:00 - loss: 1.6362 - regression_loss: 1.3847 - classification_loss: 0.2515 138/500 [=======>......................] - ETA: 1:59 - loss: 1.6367 - regression_loss: 1.3850 - classification_loss: 0.2517 139/500 [=======>......................] - ETA: 1:59 - loss: 1.6286 - regression_loss: 1.3781 - classification_loss: 0.2505 140/500 [=======>......................] - ETA: 1:59 - loss: 1.6318 - regression_loss: 1.3809 - classification_loss: 0.2509 141/500 [=======>......................] - ETA: 1:58 - loss: 1.6316 - regression_loss: 1.3806 - classification_loss: 0.2511 142/500 [=======>......................] - ETA: 1:58 - loss: 1.6340 - regression_loss: 1.3823 - classification_loss: 0.2517 143/500 [=======>......................] - ETA: 1:58 - loss: 1.6309 - regression_loss: 1.3794 - classification_loss: 0.2515 144/500 [=======>......................] - ETA: 1:57 - loss: 1.6256 - regression_loss: 1.3749 - classification_loss: 0.2506 145/500 [=======>......................] - ETA: 1:57 - loss: 1.6243 - regression_loss: 1.3740 - classification_loss: 0.2502 146/500 [=======>......................] - ETA: 1:57 - loss: 1.6189 - regression_loss: 1.3695 - classification_loss: 0.2494 147/500 [=======>......................] - ETA: 1:56 - loss: 1.6244 - regression_loss: 1.3742 - classification_loss: 0.2502 148/500 [=======>......................] 
- ETA: 1:56 - loss: 1.6262 - regression_loss: 1.3757 - classification_loss: 0.2505 149/500 [=======>......................] - ETA: 1:56 - loss: 1.6234 - regression_loss: 1.3733 - classification_loss: 0.2501 150/500 [========>.....................] - ETA: 1:55 - loss: 1.6254 - regression_loss: 1.3749 - classification_loss: 0.2506 151/500 [========>.....................] - ETA: 1:55 - loss: 1.6263 - regression_loss: 1.3758 - classification_loss: 0.2505 152/500 [========>.....................] - ETA: 1:55 - loss: 1.6242 - regression_loss: 1.3743 - classification_loss: 0.2499 153/500 [========>.....................] - ETA: 1:54 - loss: 1.6261 - regression_loss: 1.3761 - classification_loss: 0.2500 154/500 [========>.....................] - ETA: 1:54 - loss: 1.6281 - regression_loss: 1.3778 - classification_loss: 0.2503 155/500 [========>.....................] - ETA: 1:53 - loss: 1.6221 - regression_loss: 1.3728 - classification_loss: 0.2493 156/500 [========>.....................] - ETA: 1:53 - loss: 1.6250 - regression_loss: 1.3747 - classification_loss: 0.2503 157/500 [========>.....................] - ETA: 1:53 - loss: 1.6308 - regression_loss: 1.3790 - classification_loss: 0.2518 158/500 [========>.....................] - ETA: 1:52 - loss: 1.6344 - regression_loss: 1.3821 - classification_loss: 0.2524 159/500 [========>.....................] - ETA: 1:52 - loss: 1.6384 - regression_loss: 1.3852 - classification_loss: 0.2533 160/500 [========>.....................] - ETA: 1:52 - loss: 1.6410 - regression_loss: 1.3878 - classification_loss: 0.2532 161/500 [========>.....................] - ETA: 1:51 - loss: 1.6398 - regression_loss: 1.3867 - classification_loss: 0.2531 162/500 [========>.....................] - ETA: 1:51 - loss: 1.6385 - regression_loss: 1.3857 - classification_loss: 0.2528 163/500 [========>.....................] - ETA: 1:51 - loss: 1.6376 - regression_loss: 1.3850 - classification_loss: 0.2526 164/500 [========>.....................] 
- ETA: 1:51 - loss: 1.6339 - regression_loss: 1.3821 - classification_loss: 0.2518 165/500 [========>.....................] - ETA: 1:50 - loss: 1.6351 - regression_loss: 1.3835 - classification_loss: 0.2516 166/500 [========>.....................] - ETA: 1:50 - loss: 1.6345 - regression_loss: 1.3828 - classification_loss: 0.2517 167/500 [=========>....................] - ETA: 1:50 - loss: 1.6348 - regression_loss: 1.3824 - classification_loss: 0.2525 168/500 [=========>....................] - ETA: 1:49 - loss: 1.6352 - regression_loss: 1.3829 - classification_loss: 0.2523 169/500 [=========>....................] - ETA: 1:49 - loss: 1.6366 - regression_loss: 1.3842 - classification_loss: 0.2524 170/500 [=========>....................] - ETA: 1:48 - loss: 1.6347 - regression_loss: 1.3828 - classification_loss: 0.2520 171/500 [=========>....................] - ETA: 1:48 - loss: 1.6331 - regression_loss: 1.3808 - classification_loss: 0.2523 172/500 [=========>....................] - ETA: 1:48 - loss: 1.6305 - regression_loss: 1.3772 - classification_loss: 0.2533 173/500 [=========>....................] - ETA: 1:48 - loss: 1.6313 - regression_loss: 1.3779 - classification_loss: 0.2533 174/500 [=========>....................] - ETA: 1:47 - loss: 1.6334 - regression_loss: 1.3799 - classification_loss: 0.2535 175/500 [=========>....................] - ETA: 1:47 - loss: 1.6329 - regression_loss: 1.3795 - classification_loss: 0.2534 176/500 [=========>....................] - ETA: 1:47 - loss: 1.6351 - regression_loss: 1.3814 - classification_loss: 0.2536 177/500 [=========>....................] - ETA: 1:46 - loss: 1.6385 - regression_loss: 1.3841 - classification_loss: 0.2544 178/500 [=========>....................] - ETA: 1:46 - loss: 1.6386 - regression_loss: 1.3843 - classification_loss: 0.2544 179/500 [=========>....................] - ETA: 1:45 - loss: 1.6391 - regression_loss: 1.3849 - classification_loss: 0.2542 180/500 [=========>....................] 
- ETA: 1:45 - loss: 1.6403 - regression_loss: 1.3860 - classification_loss: 0.2543 181/500 [=========>....................] - ETA: 1:45 - loss: 1.6367 - regression_loss: 1.3827 - classification_loss: 0.2540 182/500 [=========>....................] - ETA: 1:44 - loss: 1.6335 - regression_loss: 1.3801 - classification_loss: 0.2535 183/500 [=========>....................] - ETA: 1:44 - loss: 1.6310 - regression_loss: 1.3777 - classification_loss: 0.2533 184/500 [==========>...................] - ETA: 1:44 - loss: 1.6310 - regression_loss: 1.3776 - classification_loss: 0.2534 185/500 [==========>...................] - ETA: 1:44 - loss: 1.6295 - regression_loss: 1.3762 - classification_loss: 0.2533 186/500 [==========>...................] - ETA: 1:43 - loss: 1.6305 - regression_loss: 1.3771 - classification_loss: 0.2534 187/500 [==========>...................] - ETA: 1:43 - loss: 1.6270 - regression_loss: 1.3735 - classification_loss: 0.2535 188/500 [==========>...................] - ETA: 1:43 - loss: 1.6247 - regression_loss: 1.3706 - classification_loss: 0.2541 189/500 [==========>...................] - ETA: 1:42 - loss: 1.6227 - regression_loss: 1.3692 - classification_loss: 0.2536 190/500 [==========>...................] - ETA: 1:42 - loss: 1.6224 - regression_loss: 1.3690 - classification_loss: 0.2534 191/500 [==========>...................] - ETA: 1:42 - loss: 1.6185 - regression_loss: 1.3657 - classification_loss: 0.2528 192/500 [==========>...................] - ETA: 1:41 - loss: 1.6177 - regression_loss: 1.3651 - classification_loss: 0.2526 193/500 [==========>...................] - ETA: 1:41 - loss: 1.6169 - regression_loss: 1.3643 - classification_loss: 0.2527 194/500 [==========>...................] - ETA: 1:41 - loss: 1.6167 - regression_loss: 1.3641 - classification_loss: 0.2526 195/500 [==========>...................] - ETA: 1:40 - loss: 1.6141 - regression_loss: 1.3622 - classification_loss: 0.2519 196/500 [==========>...................] 
- ETA: 1:40 - loss: 1.6145 - regression_loss: 1.3625 - classification_loss: 0.2520 197/500 [==========>...................] - ETA: 1:40 - loss: 1.6125 - regression_loss: 1.3608 - classification_loss: 0.2517 198/500 [==========>...................] - ETA: 1:39 - loss: 1.6147 - regression_loss: 1.3628 - classification_loss: 0.2518 199/500 [==========>...................] - ETA: 1:39 - loss: 1.6190 - regression_loss: 1.3661 - classification_loss: 0.2529 200/500 [===========>..................] - ETA: 1:39 - loss: 1.6215 - regression_loss: 1.3679 - classification_loss: 0.2536 201/500 [===========>..................] - ETA: 1:38 - loss: 1.6204 - regression_loss: 1.3671 - classification_loss: 0.2532 202/500 [===========>..................] - ETA: 1:38 - loss: 1.6231 - regression_loss: 1.3693 - classification_loss: 0.2538 203/500 [===========>..................] - ETA: 1:38 - loss: 1.6214 - regression_loss: 1.3678 - classification_loss: 0.2536 204/500 [===========>..................] - ETA: 1:37 - loss: 1.6242 - regression_loss: 1.3703 - classification_loss: 0.2539 205/500 [===========>..................] - ETA: 1:37 - loss: 1.6229 - regression_loss: 1.3687 - classification_loss: 0.2542 206/500 [===========>..................] - ETA: 1:37 - loss: 1.6236 - regression_loss: 1.3695 - classification_loss: 0.2541 207/500 [===========>..................] - ETA: 1:36 - loss: 1.6237 - regression_loss: 1.3698 - classification_loss: 0.2539 208/500 [===========>..................] - ETA: 1:36 - loss: 1.6249 - regression_loss: 1.3707 - classification_loss: 0.2542 209/500 [===========>..................] - ETA: 1:36 - loss: 1.6225 - regression_loss: 1.3689 - classification_loss: 0.2536 210/500 [===========>..................] - ETA: 1:35 - loss: 1.6206 - regression_loss: 1.3673 - classification_loss: 0.2533 211/500 [===========>..................] - ETA: 1:35 - loss: 1.6212 - regression_loss: 1.3680 - classification_loss: 0.2533 212/500 [===========>..................] 
- ETA: 1:35 - loss: 1.6230 - regression_loss: 1.3695 - classification_loss: 0.2535 213/500 [===========>..................] - ETA: 1:34 - loss: 1.6253 - regression_loss: 1.3716 - classification_loss: 0.2537 214/500 [===========>..................] - ETA: 1:34 - loss: 1.6267 - regression_loss: 1.3727 - classification_loss: 0.2540 215/500 [===========>..................] - ETA: 1:34 - loss: 1.6275 - regression_loss: 1.3734 - classification_loss: 0.2541 216/500 [===========>..................] - ETA: 1:33 - loss: 1.6281 - regression_loss: 1.3740 - classification_loss: 0.2540 217/500 [============>.................] - ETA: 1:33 - loss: 1.6279 - regression_loss: 1.3740 - classification_loss: 0.2539 218/500 [============>.................] - ETA: 1:33 - loss: 1.6305 - regression_loss: 1.3762 - classification_loss: 0.2542 219/500 [============>.................] - ETA: 1:32 - loss: 1.6314 - regression_loss: 1.3770 - classification_loss: 0.2544 220/500 [============>.................] - ETA: 1:32 - loss: 1.6313 - regression_loss: 1.3770 - classification_loss: 0.2543 221/500 [============>.................] - ETA: 1:32 - loss: 1.6299 - regression_loss: 1.3759 - classification_loss: 0.2540 222/500 [============>.................] - ETA: 1:31 - loss: 1.6287 - regression_loss: 1.3749 - classification_loss: 0.2539 223/500 [============>.................] - ETA: 1:31 - loss: 1.6277 - regression_loss: 1.3740 - classification_loss: 0.2537 224/500 [============>.................] - ETA: 1:31 - loss: 1.6261 - regression_loss: 1.3724 - classification_loss: 0.2537 225/500 [============>.................] - ETA: 1:30 - loss: 1.6276 - regression_loss: 1.3738 - classification_loss: 0.2539 226/500 [============>.................] - ETA: 1:30 - loss: 1.6299 - regression_loss: 1.3756 - classification_loss: 0.2544 227/500 [============>.................] - ETA: 1:30 - loss: 1.6290 - regression_loss: 1.3748 - classification_loss: 0.2543 228/500 [============>.................] 
- ETA: 1:29 - loss: 1.6304 - regression_loss: 1.3762 - classification_loss: 0.2542 229/500 [============>.................] - ETA: 1:29 - loss: 1.6276 - regression_loss: 1.3739 - classification_loss: 0.2537 230/500 [============>.................] - ETA: 1:29 - loss: 1.6274 - regression_loss: 1.3737 - classification_loss: 0.2537 231/500 [============>.................] - ETA: 1:28 - loss: 1.6283 - regression_loss: 1.3745 - classification_loss: 0.2538 232/500 [============>.................] - ETA: 1:28 - loss: 1.6320 - regression_loss: 1.3776 - classification_loss: 0.2543 233/500 [============>.................] - ETA: 1:28 - loss: 1.6338 - regression_loss: 1.3792 - classification_loss: 0.2547 234/500 [=============>................] - ETA: 1:27 - loss: 1.6345 - regression_loss: 1.3798 - classification_loss: 0.2548 235/500 [=============>................] - ETA: 1:27 - loss: 1.6333 - regression_loss: 1.3787 - classification_loss: 0.2546 236/500 [=============>................] - ETA: 1:27 - loss: 1.6359 - regression_loss: 1.3808 - classification_loss: 0.2551 237/500 [=============>................] - ETA: 1:26 - loss: 1.6377 - regression_loss: 1.3823 - classification_loss: 0.2554 238/500 [=============>................] - ETA: 1:26 - loss: 1.6392 - regression_loss: 1.3835 - classification_loss: 0.2557 239/500 [=============>................] - ETA: 1:26 - loss: 1.6411 - regression_loss: 1.3850 - classification_loss: 0.2561 240/500 [=============>................] - ETA: 1:25 - loss: 1.6427 - regression_loss: 1.3864 - classification_loss: 0.2563 241/500 [=============>................] - ETA: 1:25 - loss: 1.6442 - regression_loss: 1.3878 - classification_loss: 0.2564 242/500 [=============>................] - ETA: 1:25 - loss: 1.6463 - regression_loss: 1.3896 - classification_loss: 0.2567 243/500 [=============>................] - ETA: 1:25 - loss: 1.6473 - regression_loss: 1.3904 - classification_loss: 0.2569 244/500 [=============>................] 
- ETA: 1:24 - loss: 1.6486 - regression_loss: 1.3915 - classification_loss: 0.2571 245/500 [=============>................] - ETA: 1:24 - loss: 1.6483 - regression_loss: 1.3914 - classification_loss: 0.2569 246/500 [=============>................] - ETA: 1:24 - loss: 1.6463 - regression_loss: 1.3889 - classification_loss: 0.2573 247/500 [=============>................] - ETA: 1:23 - loss: 1.6440 - regression_loss: 1.3870 - classification_loss: 0.2569 248/500 [=============>................] - ETA: 1:23 - loss: 1.6414 - regression_loss: 1.3848 - classification_loss: 0.2566 249/500 [=============>................] - ETA: 1:23 - loss: 1.6401 - regression_loss: 1.3838 - classification_loss: 0.2564 250/500 [==============>...............] - ETA: 1:22 - loss: 1.6390 - regression_loss: 1.3831 - classification_loss: 0.2560 251/500 [==============>...............] - ETA: 1:22 - loss: 1.6390 - regression_loss: 1.3830 - classification_loss: 0.2560 252/500 [==============>...............] - ETA: 1:22 - loss: 1.6375 - regression_loss: 1.3817 - classification_loss: 0.2558 253/500 [==============>...............] - ETA: 1:21 - loss: 1.6402 - regression_loss: 1.3841 - classification_loss: 0.2560 254/500 [==============>...............] - ETA: 1:21 - loss: 1.6401 - regression_loss: 1.3839 - classification_loss: 0.2562 255/500 [==============>...............] - ETA: 1:21 - loss: 1.6406 - regression_loss: 1.3842 - classification_loss: 0.2564 256/500 [==============>...............] - ETA: 1:20 - loss: 1.6380 - regression_loss: 1.3820 - classification_loss: 0.2560 257/500 [==============>...............] - ETA: 1:20 - loss: 1.6386 - regression_loss: 1.3826 - classification_loss: 0.2560 258/500 [==============>...............] - ETA: 1:20 - loss: 1.6366 - regression_loss: 1.3810 - classification_loss: 0.2557 259/500 [==============>...............] - ETA: 1:19 - loss: 1.6367 - regression_loss: 1.3810 - classification_loss: 0.2557 260/500 [==============>...............] 
- ETA: 1:19 - loss: 1.6370 - regression_loss: 1.3811 - classification_loss: 0.2559 261/500 [==============>...............] - ETA: 1:19 - loss: 1.6372 - regression_loss: 1.3814 - classification_loss: 0.2558 262/500 [==============>...............] - ETA: 1:18 - loss: 1.6383 - regression_loss: 1.3822 - classification_loss: 0.2560 263/500 [==============>...............] - ETA: 1:18 - loss: 1.6387 - regression_loss: 1.3828 - classification_loss: 0.2559 264/500 [==============>...............] - ETA: 1:18 - loss: 1.6396 - regression_loss: 1.3836 - classification_loss: 0.2560 265/500 [==============>...............] - ETA: 1:17 - loss: 1.6368 - regression_loss: 1.3811 - classification_loss: 0.2557 266/500 [==============>...............] - ETA: 1:17 - loss: 1.6363 - regression_loss: 1.3810 - classification_loss: 0.2554 267/500 [===============>..............] - ETA: 1:17 - loss: 1.6357 - regression_loss: 1.3805 - classification_loss: 0.2552 268/500 [===============>..............] - ETA: 1:16 - loss: 1.6357 - regression_loss: 1.3806 - classification_loss: 0.2551 269/500 [===============>..............] - ETA: 1:16 - loss: 1.6335 - regression_loss: 1.3787 - classification_loss: 0.2548 270/500 [===============>..............] - ETA: 1:16 - loss: 1.6355 - regression_loss: 1.3803 - classification_loss: 0.2552 271/500 [===============>..............] - ETA: 1:15 - loss: 1.6346 - regression_loss: 1.3792 - classification_loss: 0.2555 272/500 [===============>..............] - ETA: 1:15 - loss: 1.6375 - regression_loss: 1.3819 - classification_loss: 0.2556 273/500 [===============>..............] - ETA: 1:15 - loss: 1.6380 - regression_loss: 1.3823 - classification_loss: 0.2557 274/500 [===============>..............] - ETA: 1:14 - loss: 1.6381 - regression_loss: 1.3824 - classification_loss: 0.2556 275/500 [===============>..............] - ETA: 1:14 - loss: 1.6380 - regression_loss: 1.3824 - classification_loss: 0.2556 276/500 [===============>..............] 
- ETA: 1:14 - loss: 1.6349 - regression_loss: 1.3799 - classification_loss: 0.2550 277/500 [===============>..............] - ETA: 1:13 - loss: 1.6355 - regression_loss: 1.3803 - classification_loss: 0.2552 278/500 [===============>..............] - ETA: 1:13 - loss: 1.6369 - regression_loss: 1.3814 - classification_loss: 0.2555 279/500 [===============>..............] - ETA: 1:13 - loss: 1.6333 - regression_loss: 1.3784 - classification_loss: 0.2550 280/500 [===============>..............] - ETA: 1:12 - loss: 1.6299 - regression_loss: 1.3754 - classification_loss: 0.2545 281/500 [===============>..............] - ETA: 1:12 - loss: 1.6262 - regression_loss: 1.3721 - classification_loss: 0.2540 282/500 [===============>..............] - ETA: 1:12 - loss: 1.6265 - regression_loss: 1.3725 - classification_loss: 0.2540 283/500 [===============>..............] - ETA: 1:11 - loss: 1.6262 - regression_loss: 1.3723 - classification_loss: 0.2539 284/500 [================>.............] - ETA: 1:11 - loss: 1.6254 - regression_loss: 1.3714 - classification_loss: 0.2539 285/500 [================>.............] - ETA: 1:11 - loss: 1.6256 - regression_loss: 1.3717 - classification_loss: 0.2539 286/500 [================>.............] - ETA: 1:10 - loss: 1.6242 - regression_loss: 1.3705 - classification_loss: 0.2537 287/500 [================>.............] - ETA: 1:10 - loss: 1.6228 - regression_loss: 1.3693 - classification_loss: 0.2535 288/500 [================>.............] - ETA: 1:10 - loss: 1.6228 - regression_loss: 1.3693 - classification_loss: 0.2535 289/500 [================>.............] - ETA: 1:09 - loss: 1.6206 - regression_loss: 1.3673 - classification_loss: 0.2533 290/500 [================>.............] - ETA: 1:09 - loss: 1.6200 - regression_loss: 1.3669 - classification_loss: 0.2531 291/500 [================>.............] - ETA: 1:09 - loss: 1.6177 - regression_loss: 1.3648 - classification_loss: 0.2528 292/500 [================>.............] 
- ETA: 1:08 - loss: 1.6161 - regression_loss: 1.3636 - classification_loss: 0.2526 293/500 [================>.............] - ETA: 1:08 - loss: 1.6164 - regression_loss: 1.3637 - classification_loss: 0.2528 294/500 [================>.............] - ETA: 1:08 - loss: 1.6161 - regression_loss: 1.3633 - classification_loss: 0.2528 295/500 [================>.............] - ETA: 1:07 - loss: 1.6140 - regression_loss: 1.3617 - classification_loss: 0.2523 296/500 [================>.............] - ETA: 1:07 - loss: 1.6153 - regression_loss: 1.3629 - classification_loss: 0.2524 297/500 [================>.............] - ETA: 1:07 - loss: 1.6151 - regression_loss: 1.3626 - classification_loss: 0.2525 298/500 [================>.............] - ETA: 1:06 - loss: 1.6151 - regression_loss: 1.3626 - classification_loss: 0.2524 299/500 [================>.............] - ETA: 1:06 - loss: 1.6153 - regression_loss: 1.3629 - classification_loss: 0.2524 300/500 [=================>............] - ETA: 1:06 - loss: 1.6159 - regression_loss: 1.3636 - classification_loss: 0.2524 301/500 [=================>............] - ETA: 1:05 - loss: 1.6139 - regression_loss: 1.3620 - classification_loss: 0.2519 302/500 [=================>............] - ETA: 1:05 - loss: 1.6135 - regression_loss: 1.3617 - classification_loss: 0.2518 303/500 [=================>............] - ETA: 1:05 - loss: 1.6138 - regression_loss: 1.3619 - classification_loss: 0.2518 304/500 [=================>............] - ETA: 1:04 - loss: 1.6134 - regression_loss: 1.3614 - classification_loss: 0.2520 305/500 [=================>............] - ETA: 1:04 - loss: 1.6110 - regression_loss: 1.3595 - classification_loss: 0.2516 306/500 [=================>............] - ETA: 1:04 - loss: 1.6118 - regression_loss: 1.3602 - classification_loss: 0.2516 307/500 [=================>............] - ETA: 1:03 - loss: 1.6096 - regression_loss: 1.3584 - classification_loss: 0.2512 308/500 [=================>............] 
- ETA: 1:03 - loss: 1.6077 - regression_loss: 1.3568 - classification_loss: 0.2509 309/500 [=================>............] - ETA: 1:03 - loss: 1.6062 - regression_loss: 1.3554 - classification_loss: 0.2509 310/500 [=================>............] - ETA: 1:02 - loss: 1.6065 - regression_loss: 1.3557 - classification_loss: 0.2508 311/500 [=================>............] - ETA: 1:02 - loss: 1.6067 - regression_loss: 1.3560 - classification_loss: 0.2508 312/500 [=================>............] - ETA: 1:02 - loss: 1.6081 - regression_loss: 1.3570 - classification_loss: 0.2511 313/500 [=================>............] - ETA: 1:01 - loss: 1.6088 - regression_loss: 1.3579 - classification_loss: 0.2510 314/500 [=================>............] - ETA: 1:01 - loss: 1.6068 - regression_loss: 1.3562 - classification_loss: 0.2506 315/500 [=================>............] - ETA: 1:01 - loss: 1.6064 - regression_loss: 1.3558 - classification_loss: 0.2505 316/500 [=================>............] - ETA: 1:00 - loss: 1.6043 - regression_loss: 1.3540 - classification_loss: 0.2503 317/500 [==================>...........] - ETA: 1:00 - loss: 1.6061 - regression_loss: 1.3557 - classification_loss: 0.2504 318/500 [==================>...........] - ETA: 1:00 - loss: 1.6050 - regression_loss: 1.3549 - classification_loss: 0.2501 319/500 [==================>...........] - ETA: 59s - loss: 1.6056 - regression_loss: 1.3554 - classification_loss: 0.2502  320/500 [==================>...........] - ETA: 59s - loss: 1.6067 - regression_loss: 1.3562 - classification_loss: 0.2504 321/500 [==================>...........] - ETA: 59s - loss: 1.6064 - regression_loss: 1.3562 - classification_loss: 0.2502 322/500 [==================>...........] - ETA: 58s - loss: 1.6037 - regression_loss: 1.3540 - classification_loss: 0.2497 323/500 [==================>...........] - ETA: 58s - loss: 1.6050 - regression_loss: 1.3552 - classification_loss: 0.2498 324/500 [==================>...........] 
- ETA: 58s - loss: 1.6061 - regression_loss: 1.3562 - classification_loss: 0.2499 325/500 [==================>...........] - ETA: 57s - loss: 1.6057 - regression_loss: 1.3558 - classification_loss: 0.2499 326/500 [==================>...........] - ETA: 57s - loss: 1.6042 - regression_loss: 1.3546 - classification_loss: 0.2496 327/500 [==================>...........] - ETA: 57s - loss: 1.6056 - regression_loss: 1.3558 - classification_loss: 0.2498 328/500 [==================>...........] - ETA: 56s - loss: 1.6031 - regression_loss: 1.3536 - classification_loss: 0.2495 329/500 [==================>...........] - ETA: 56s - loss: 1.5999 - regression_loss: 1.3508 - classification_loss: 0.2491 330/500 [==================>...........] - ETA: 56s - loss: 1.6010 - regression_loss: 1.3519 - classification_loss: 0.2491 331/500 [==================>...........] - ETA: 55s - loss: 1.6005 - regression_loss: 1.3516 - classification_loss: 0.2490 332/500 [==================>...........] - ETA: 55s - loss: 1.6017 - regression_loss: 1.3526 - classification_loss: 0.2491 333/500 [==================>...........] - ETA: 55s - loss: 1.6029 - regression_loss: 1.3537 - classification_loss: 0.2492 334/500 [===================>..........] - ETA: 54s - loss: 1.6031 - regression_loss: 1.3540 - classification_loss: 0.2491 335/500 [===================>..........] - ETA: 54s - loss: 1.6044 - regression_loss: 1.3552 - classification_loss: 0.2492 336/500 [===================>..........] - ETA: 54s - loss: 1.6043 - regression_loss: 1.3549 - classification_loss: 0.2494 337/500 [===================>..........] - ETA: 53s - loss: 1.6038 - regression_loss: 1.3545 - classification_loss: 0.2493 338/500 [===================>..........] - ETA: 53s - loss: 1.6022 - regression_loss: 1.3531 - classification_loss: 0.2491 339/500 [===================>..........] - ETA: 53s - loss: 1.6025 - regression_loss: 1.3535 - classification_loss: 0.2490 340/500 [===================>..........] 
- ETA: 52s - loss: 1.5998 - regression_loss: 1.3513 - classification_loss: 0.2485 341/500 [===================>..........] - ETA: 52s - loss: 1.5971 - regression_loss: 1.3489 - classification_loss: 0.2482 342/500 [===================>..........] - ETA: 52s - loss: 1.5976 - regression_loss: 1.3494 - classification_loss: 0.2482 343/500 [===================>..........] - ETA: 51s - loss: 1.5974 - regression_loss: 1.3493 - classification_loss: 0.2481 344/500 [===================>..........] - ETA: 51s - loss: 1.5993 - regression_loss: 1.3508 - classification_loss: 0.2485 345/500 [===================>..........] - ETA: 51s - loss: 1.5968 - regression_loss: 1.3487 - classification_loss: 0.2481 346/500 [===================>..........] - ETA: 50s - loss: 1.5964 - regression_loss: 1.3483 - classification_loss: 0.2481 347/500 [===================>..........] - ETA: 50s - loss: 1.5980 - regression_loss: 1.3495 - classification_loss: 0.2484 348/500 [===================>..........] - ETA: 50s - loss: 1.5982 - regression_loss: 1.3498 - classification_loss: 0.2484 349/500 [===================>..........] - ETA: 49s - loss: 1.5993 - regression_loss: 1.3508 - classification_loss: 0.2485 350/500 [====================>.........] - ETA: 49s - loss: 1.6002 - regression_loss: 1.3517 - classification_loss: 0.2486 351/500 [====================>.........] - ETA: 49s - loss: 1.6023 - regression_loss: 1.3534 - classification_loss: 0.2489 352/500 [====================>.........] - ETA: 48s - loss: 1.6034 - regression_loss: 1.3544 - classification_loss: 0.2489 353/500 [====================>.........] - ETA: 48s - loss: 1.6044 - regression_loss: 1.3555 - classification_loss: 0.2489 354/500 [====================>.........] - ETA: 48s - loss: 1.6062 - regression_loss: 1.3571 - classification_loss: 0.2492 355/500 [====================>.........] - ETA: 47s - loss: 1.6062 - regression_loss: 1.3571 - classification_loss: 0.2491 356/500 [====================>.........] 
- ETA: 47s - loss: 1.6035 - regression_loss: 1.3548 - classification_loss: 0.2487 357/500 [====================>.........] - ETA: 47s - loss: 1.6034 - regression_loss: 1.3548 - classification_loss: 0.2486 358/500 [====================>.........] - ETA: 47s - loss: 1.6039 - regression_loss: 1.3544 - classification_loss: 0.2496 359/500 [====================>.........] - ETA: 46s - loss: 1.6034 - regression_loss: 1.3539 - classification_loss: 0.2495 360/500 [====================>.........] - ETA: 46s - loss: 1.6045 - regression_loss: 1.3548 - classification_loss: 0.2497 361/500 [====================>.........] - ETA: 46s - loss: 1.6069 - regression_loss: 1.3569 - classification_loss: 0.2500 362/500 [====================>.........] - ETA: 45s - loss: 1.6047 - regression_loss: 1.3550 - classification_loss: 0.2496 363/500 [====================>.........] - ETA: 45s - loss: 1.6058 - regression_loss: 1.3561 - classification_loss: 0.2496 364/500 [====================>.........] - ETA: 45s - loss: 1.6059 - regression_loss: 1.3563 - classification_loss: 0.2496 365/500 [====================>.........] - ETA: 44s - loss: 1.6062 - regression_loss: 1.3565 - classification_loss: 0.2497 366/500 [====================>.........] - ETA: 44s - loss: 1.6039 - regression_loss: 1.3546 - classification_loss: 0.2494 367/500 [=====================>........] - ETA: 44s - loss: 1.6041 - regression_loss: 1.3546 - classification_loss: 0.2495 368/500 [=====================>........] - ETA: 43s - loss: 1.6050 - regression_loss: 1.3553 - classification_loss: 0.2497 369/500 [=====================>........] - ETA: 43s - loss: 1.6062 - regression_loss: 1.3562 - classification_loss: 0.2499 370/500 [=====================>........] - ETA: 43s - loss: 1.6058 - regression_loss: 1.3559 - classification_loss: 0.2499 371/500 [=====================>........] - ETA: 42s - loss: 1.6060 - regression_loss: 1.3562 - classification_loss: 0.2499 372/500 [=====================>........] 
- ETA: 42s - loss: 1.6063 - regression_loss: 1.3562 - classification_loss: 0.2501 373/500 [=====================>........] - ETA: 42s - loss: 1.6062 - regression_loss: 1.3562 - classification_loss: 0.2501 374/500 [=====================>........] - ETA: 41s - loss: 1.6070 - regression_loss: 1.3569 - classification_loss: 0.2500 375/500 [=====================>........] - ETA: 41s - loss: 1.6042 - regression_loss: 1.3546 - classification_loss: 0.2497 376/500 [=====================>........] - ETA: 41s - loss: 1.6030 - regression_loss: 1.3536 - classification_loss: 0.2494 377/500 [=====================>........] - ETA: 40s - loss: 1.6028 - regression_loss: 1.3534 - classification_loss: 0.2494 378/500 [=====================>........] - ETA: 40s - loss: 1.6033 - regression_loss: 1.3539 - classification_loss: 0.2494 379/500 [=====================>........] - ETA: 40s - loss: 1.6040 - regression_loss: 1.3545 - classification_loss: 0.2495 380/500 [=====================>........] - ETA: 39s - loss: 1.6046 - regression_loss: 1.3550 - classification_loss: 0.2496 381/500 [=====================>........] - ETA: 39s - loss: 1.6042 - regression_loss: 1.3547 - classification_loss: 0.2496 382/500 [=====================>........] - ETA: 39s - loss: 1.6047 - regression_loss: 1.3552 - classification_loss: 0.2495 383/500 [=====================>........] - ETA: 38s - loss: 1.6054 - regression_loss: 1.3558 - classification_loss: 0.2496 384/500 [======================>.......] - ETA: 38s - loss: 1.6059 - regression_loss: 1.3561 - classification_loss: 0.2498 385/500 [======================>.......] - ETA: 38s - loss: 1.6044 - regression_loss: 1.3549 - classification_loss: 0.2495 386/500 [======================>.......] - ETA: 37s - loss: 1.6029 - regression_loss: 1.3537 - classification_loss: 0.2492 387/500 [======================>.......] - ETA: 37s - loss: 1.6035 - regression_loss: 1.3541 - classification_loss: 0.2493 388/500 [======================>.......] 
- ETA: 37s - loss: 1.6030 - regression_loss: 1.3539 - classification_loss: 0.2492 389/500 [======================>.......] - ETA: 36s - loss: 1.6020 - regression_loss: 1.3531 - classification_loss: 0.2489 390/500 [======================>.......] - ETA: 36s - loss: 1.6019 - regression_loss: 1.3531 - classification_loss: 0.2488 391/500 [======================>.......] - ETA: 36s - loss: 1.6025 - regression_loss: 1.3535 - classification_loss: 0.2489 392/500 [======================>.......] - ETA: 35s - loss: 1.6037 - regression_loss: 1.3546 - classification_loss: 0.2491 393/500 [======================>.......] - ETA: 35s - loss: 1.6036 - regression_loss: 1.3545 - classification_loss: 0.2490 394/500 [======================>.......] - ETA: 35s - loss: 1.6045 - regression_loss: 1.3554 - classification_loss: 0.2491 395/500 [======================>.......] - ETA: 34s - loss: 1.6032 - regression_loss: 1.3544 - classification_loss: 0.2488 396/500 [======================>.......] - ETA: 34s - loss: 1.6028 - regression_loss: 1.3540 - classification_loss: 0.2488 397/500 [======================>.......] - ETA: 34s - loss: 1.6035 - regression_loss: 1.3549 - classification_loss: 0.2486 398/500 [======================>.......] - ETA: 33s - loss: 1.6016 - regression_loss: 1.3533 - classification_loss: 0.2483 399/500 [======================>.......] - ETA: 33s - loss: 1.6027 - regression_loss: 1.3544 - classification_loss: 0.2483 400/500 [=======================>......] - ETA: 33s - loss: 1.6022 - regression_loss: 1.3541 - classification_loss: 0.2481 401/500 [=======================>......] - ETA: 32s - loss: 1.6026 - regression_loss: 1.3545 - classification_loss: 0.2481 402/500 [=======================>......] - ETA: 32s - loss: 1.6002 - regression_loss: 1.3524 - classification_loss: 0.2478 403/500 [=======================>......] - ETA: 32s - loss: 1.6001 - regression_loss: 1.3523 - classification_loss: 0.2477 404/500 [=======================>......] 
- ETA: 31s - loss: 1.5999 - regression_loss: 1.3521 - classification_loss: 0.2478 405/500 [=======================>......] - ETA: 31s - loss: 1.5981 - regression_loss: 1.3504 - classification_loss: 0.2477 406/500 [=======================>......] - ETA: 31s - loss: 1.5991 - regression_loss: 1.3514 - classification_loss: 0.2478 407/500 [=======================>......] - ETA: 30s - loss: 1.5989 - regression_loss: 1.3511 - classification_loss: 0.2477 408/500 [=======================>......] - ETA: 30s - loss: 1.5991 - regression_loss: 1.3514 - classification_loss: 0.2477 409/500 [=======================>......] - ETA: 30s - loss: 1.6007 - regression_loss: 1.3529 - classification_loss: 0.2478 410/500 [=======================>......] - ETA: 29s - loss: 1.6012 - regression_loss: 1.3534 - classification_loss: 0.2478 411/500 [=======================>......] - ETA: 29s - loss: 1.6016 - regression_loss: 1.3538 - classification_loss: 0.2479 412/500 [=======================>......] - ETA: 29s - loss: 1.6007 - regression_loss: 1.3530 - classification_loss: 0.2477 413/500 [=======================>......] - ETA: 28s - loss: 1.6010 - regression_loss: 1.3532 - classification_loss: 0.2477 414/500 [=======================>......] - ETA: 28s - loss: 1.6006 - regression_loss: 1.3530 - classification_loss: 0.2477 415/500 [=======================>......] - ETA: 28s - loss: 1.5994 - regression_loss: 1.3520 - classification_loss: 0.2474 416/500 [=======================>......] - ETA: 27s - loss: 1.5985 - regression_loss: 1.3512 - classification_loss: 0.2473 417/500 [========================>.....] - ETA: 27s - loss: 1.5992 - regression_loss: 1.3516 - classification_loss: 0.2476 418/500 [========================>.....] - ETA: 27s - loss: 1.5994 - regression_loss: 1.3519 - classification_loss: 0.2475 419/500 [========================>.....] - ETA: 26s - loss: 1.5994 - regression_loss: 1.3520 - classification_loss: 0.2475 420/500 [========================>.....] 
- ETA: 26s - loss: 1.5981 - regression_loss: 1.3509 - classification_loss: 0.2472 421/500 [========================>.....] - ETA: 26s - loss: 1.5961 - regression_loss: 1.3492 - classification_loss: 0.2469 422/500 [========================>.....] - ETA: 25s - loss: 1.5966 - regression_loss: 1.3496 - classification_loss: 0.2469 423/500 [========================>.....] - ETA: 25s - loss: 1.5966 - regression_loss: 1.3498 - classification_loss: 0.2469 424/500 [========================>.....] - ETA: 25s - loss: 1.5966 - regression_loss: 1.3497 - classification_loss: 0.2469 425/500 [========================>.....] - ETA: 24s - loss: 1.5974 - regression_loss: 1.3504 - classification_loss: 0.2470 426/500 [========================>.....] - ETA: 24s - loss: 1.5973 - regression_loss: 1.3503 - classification_loss: 0.2469 427/500 [========================>.....] - ETA: 24s - loss: 1.5951 - regression_loss: 1.3484 - classification_loss: 0.2466 428/500 [========================>.....] - ETA: 23s - loss: 1.5961 - regression_loss: 1.3493 - classification_loss: 0.2468 429/500 [========================>.....] - ETA: 23s - loss: 1.5971 - regression_loss: 1.3501 - classification_loss: 0.2470 430/500 [========================>.....] - ETA: 23s - loss: 1.5974 - regression_loss: 1.3503 - classification_loss: 0.2471 431/500 [========================>.....] - ETA: 22s - loss: 1.5958 - regression_loss: 1.3490 - classification_loss: 0.2468 432/500 [========================>.....] - ETA: 22s - loss: 1.5941 - regression_loss: 1.3476 - classification_loss: 0.2465 433/500 [========================>.....] - ETA: 22s - loss: 1.5928 - regression_loss: 1.3465 - classification_loss: 0.2463 434/500 [=========================>....] - ETA: 21s - loss: 1.5945 - regression_loss: 1.3479 - classification_loss: 0.2466 435/500 [=========================>....] - ETA: 21s - loss: 1.5942 - regression_loss: 1.3477 - classification_loss: 0.2465 436/500 [=========================>....] 
- ETA: 21s - loss: 1.5945 - regression_loss: 1.3480 - classification_loss: 0.2465 437/500 [=========================>....] - ETA: 20s - loss: 1.5940 - regression_loss: 1.3477 - classification_loss: 0.2463 438/500 [=========================>....] - ETA: 20s - loss: 1.5935 - regression_loss: 1.3473 - classification_loss: 0.2462 439/500 [=========================>....] - ETA: 20s - loss: 1.5930 - regression_loss: 1.3469 - classification_loss: 0.2461 440/500 [=========================>....] - ETA: 19s - loss: 1.5928 - regression_loss: 1.3469 - classification_loss: 0.2459 441/500 [=========================>....] - ETA: 19s - loss: 1.5914 - regression_loss: 1.3458 - classification_loss: 0.2456 442/500 [=========================>....] - ETA: 19s - loss: 1.5905 - regression_loss: 1.3451 - classification_loss: 0.2454 443/500 [=========================>....] - ETA: 18s - loss: 1.5890 - regression_loss: 1.3439 - classification_loss: 0.2451 444/500 [=========================>....] - ETA: 18s - loss: 1.5895 - regression_loss: 1.3444 - classification_loss: 0.2451 445/500 [=========================>....] - ETA: 18s - loss: 1.5880 - regression_loss: 1.3430 - classification_loss: 0.2450 446/500 [=========================>....] - ETA: 17s - loss: 1.5892 - regression_loss: 1.3440 - classification_loss: 0.2452 447/500 [=========================>....] - ETA: 17s - loss: 1.5891 - regression_loss: 1.3439 - classification_loss: 0.2452 448/500 [=========================>....] - ETA: 17s - loss: 1.5910 - regression_loss: 1.3454 - classification_loss: 0.2456 449/500 [=========================>....] - ETA: 16s - loss: 1.5920 - regression_loss: 1.3459 - classification_loss: 0.2461 450/500 [==========================>...] - ETA: 16s - loss: 1.5924 - regression_loss: 1.3463 - classification_loss: 0.2461 451/500 [==========================>...] - ETA: 16s - loss: 1.5915 - regression_loss: 1.3456 - classification_loss: 0.2459 452/500 [==========================>...] 
- ETA: 15s - loss: 1.5918 - regression_loss: 1.3459 - classification_loss: 0.2458 453/500 [==========================>...] - ETA: 15s - loss: 1.5915 - regression_loss: 1.3457 - classification_loss: 0.2458 454/500 [==========================>...] - ETA: 15s - loss: 1.5920 - regression_loss: 1.3462 - classification_loss: 0.2459 455/500 [==========================>...] - ETA: 14s - loss: 1.5925 - regression_loss: 1.3466 - classification_loss: 0.2459 456/500 [==========================>...] - ETA: 14s - loss: 1.5931 - regression_loss: 1.3471 - classification_loss: 0.2460 457/500 [==========================>...] - ETA: 14s - loss: 1.5928 - regression_loss: 1.3469 - classification_loss: 0.2459 458/500 [==========================>...] - ETA: 13s - loss: 1.5941 - regression_loss: 1.3480 - classification_loss: 0.2461 459/500 [==========================>...] - ETA: 13s - loss: 1.5937 - regression_loss: 1.3476 - classification_loss: 0.2461 460/500 [==========================>...] - ETA: 13s - loss: 1.5944 - regression_loss: 1.3482 - classification_loss: 0.2462 461/500 [==========================>...] - ETA: 12s - loss: 1.5943 - regression_loss: 1.3483 - classification_loss: 0.2460 462/500 [==========================>...] - ETA: 12s - loss: 1.5941 - regression_loss: 1.3482 - classification_loss: 0.2459 463/500 [==========================>...] - ETA: 12s - loss: 1.5944 - regression_loss: 1.3484 - classification_loss: 0.2459 464/500 [==========================>...] - ETA: 11s - loss: 1.5953 - regression_loss: 1.3492 - classification_loss: 0.2461 465/500 [==========================>...] - ETA: 11s - loss: 1.5945 - regression_loss: 1.3485 - classification_loss: 0.2459 466/500 [==========================>...] - ETA: 11s - loss: 1.5951 - regression_loss: 1.3491 - classification_loss: 0.2460 467/500 [===========================>..] - ETA: 10s - loss: 1.5954 - regression_loss: 1.3493 - classification_loss: 0.2460 468/500 [===========================>..] 
- ETA: 10s - loss: 1.5938 - regression_loss: 1.3475 - classification_loss: 0.2463 469/500 [===========================>..] - ETA: 10s - loss: 1.5947 - regression_loss: 1.3483 - classification_loss: 0.2464 470/500 [===========================>..] - ETA: 9s - loss: 1.5956 - regression_loss: 1.3491 - classification_loss: 0.2465  471/500 [===========================>..] - ETA: 9s - loss: 1.5956 - regression_loss: 1.3491 - classification_loss: 0.2465 472/500 [===========================>..] - ETA: 9s - loss: 1.5962 - regression_loss: 1.3498 - classification_loss: 0.2465 473/500 [===========================>..] - ETA: 8s - loss: 1.5973 - regression_loss: 1.3507 - classification_loss: 0.2466 474/500 [===========================>..] - ETA: 8s - loss: 1.5965 - regression_loss: 1.3501 - classification_loss: 0.2465 475/500 [===========================>..] - ETA: 8s - loss: 1.5959 - regression_loss: 1.3495 - classification_loss: 0.2464 476/500 [===========================>..] - ETA: 7s - loss: 1.5971 - regression_loss: 1.3506 - classification_loss: 0.2465 477/500 [===========================>..] - ETA: 7s - loss: 1.5969 - regression_loss: 1.3505 - classification_loss: 0.2464 478/500 [===========================>..] - ETA: 7s - loss: 1.5957 - regression_loss: 1.3494 - classification_loss: 0.2462 479/500 [===========================>..] - ETA: 6s - loss: 1.5945 - regression_loss: 1.3484 - classification_loss: 0.2461 480/500 [===========================>..] - ETA: 6s - loss: 1.5955 - regression_loss: 1.3493 - classification_loss: 0.2462 481/500 [===========================>..] - ETA: 6s - loss: 1.5949 - regression_loss: 1.3488 - classification_loss: 0.2461 482/500 [===========================>..] - ETA: 5s - loss: 1.5957 - regression_loss: 1.3496 - classification_loss: 0.2460 483/500 [===========================>..] - ETA: 5s - loss: 1.5953 - regression_loss: 1.3493 - classification_loss: 0.2460 484/500 [============================>.] 
- ETA: 5s - loss: 1.5943 - regression_loss: 1.3485 - classification_loss: 0.2458 485/500 [============================>.] - ETA: 4s - loss: 1.5944 - regression_loss: 1.3486 - classification_loss: 0.2457 486/500 [============================>.] - ETA: 4s - loss: 1.5950 - regression_loss: 1.3493 - classification_loss: 0.2458 487/500 [============================>.] - ETA: 4s - loss: 1.5939 - regression_loss: 1.3482 - classification_loss: 0.2457 488/500 [============================>.] - ETA: 3s - loss: 1.5949 - regression_loss: 1.3491 - classification_loss: 0.2457 489/500 [============================>.] - ETA: 3s - loss: 1.5951 - regression_loss: 1.3493 - classification_loss: 0.2459 490/500 [============================>.] - ETA: 3s - loss: 1.5942 - regression_loss: 1.3485 - classification_loss: 0.2457 491/500 [============================>.] - ETA: 2s - loss: 1.5943 - regression_loss: 1.3488 - classification_loss: 0.2456 492/500 [============================>.] - ETA: 2s - loss: 1.5951 - regression_loss: 1.3495 - classification_loss: 0.2456 493/500 [============================>.] - ETA: 2s - loss: 1.5964 - regression_loss: 1.3506 - classification_loss: 0.2458 494/500 [============================>.] - ETA: 1s - loss: 1.5971 - regression_loss: 1.3513 - classification_loss: 0.2459 495/500 [============================>.] - ETA: 1s - loss: 1.5956 - regression_loss: 1.3499 - classification_loss: 0.2456 496/500 [============================>.] - ETA: 1s - loss: 1.5969 - regression_loss: 1.3511 - classification_loss: 0.2458 497/500 [============================>.] - ETA: 0s - loss: 1.5964 - regression_loss: 1.3507 - classification_loss: 0.2457 498/500 [============================>.] - ETA: 0s - loss: 1.5972 - regression_loss: 1.3515 - classification_loss: 0.2458 499/500 [============================>.] 
- ETA: 0s - loss: 1.5979 - regression_loss: 1.3521 - classification_loss: 0.2458 500/500 [==============================] - 166s 331ms/step - loss: 1.5983 - regression_loss: 1.3525 - classification_loss: 0.2458
1172 instances of class plum with average precision: 0.5467
mAP: 0.5467
Epoch 00015: saving model to ./training/snapshots/resnet101_pascal_15.h5
Epoch 16/150
1/500 [..............................] - ETA: 2:36 - loss: 1.8097 - regression_loss: 1.5363 - classification_loss: 0.2734 2/500 [..............................] - ETA: 2:37 - loss: 2.0924 - regression_loss: 1.7817 - classification_loss: 0.3107 3/500 [..............................] - ETA: 2:43 - loss: 2.0539 - regression_loss: 1.7453 - classification_loss: 0.3086 4/500 [..............................] - ETA: 2:46 - loss: 1.9404 - regression_loss: 1.6445 - classification_loss: 0.2959 5/500 [..............................] - ETA: 2:47 - loss: 1.8930 - regression_loss: 1.6064 - classification_loss: 0.2866 6/500 [..............................] - ETA: 2:46 - loss: 1.8494 - regression_loss: 1.5734 - classification_loss: 0.2760 7/500 [..............................] - ETA: 2:47 - loss: 1.8562 - regression_loss: 1.5776 - classification_loss: 0.2786 8/500 [..............................] - ETA: 2:45 - loss: 1.8738 - regression_loss: 1.5886 - classification_loss: 0.2852 9/500 [..............................] - ETA: 2:44 - loss: 1.8747 - regression_loss: 1.5892 - classification_loss: 0.2855 10/500 [..............................] - ETA: 2:44 - loss: 1.8538 - regression_loss: 1.5748 - classification_loss: 0.2790 11/500 [..............................] - ETA: 2:43 - loss: 1.8449 - regression_loss: 1.5695 - classification_loss: 0.2754 12/500 [..............................] - ETA: 2:43 - loss: 1.8395 - regression_loss: 1.5642 - classification_loss: 0.2753 13/500 [..............................] - ETA: 2:42 - loss: 1.8346 - regression_loss: 1.5580 - classification_loss: 0.2767 14/500 [..............................] 
- ETA: 2:42 - loss: 1.8558 - regression_loss: 1.5707 - classification_loss: 0.2851 15/500 [..............................] - ETA: 2:41 - loss: 1.8342 - regression_loss: 1.5512 - classification_loss: 0.2830 16/500 [..............................] - ETA: 2:40 - loss: 1.8371 - regression_loss: 1.5560 - classification_loss: 0.2812 17/500 [>.............................] - ETA: 2:39 - loss: 1.7886 - regression_loss: 1.5097 - classification_loss: 0.2789 18/500 [>.............................] - ETA: 2:38 - loss: 1.7992 - regression_loss: 1.5194 - classification_loss: 0.2798 19/500 [>.............................] - ETA: 2:38 - loss: 1.7899 - regression_loss: 1.5087 - classification_loss: 0.2812 20/500 [>.............................] - ETA: 2:37 - loss: 1.7583 - regression_loss: 1.4840 - classification_loss: 0.2743 21/500 [>.............................] - ETA: 2:37 - loss: 1.7603 - regression_loss: 1.4873 - classification_loss: 0.2730 22/500 [>.............................] - ETA: 2:37 - loss: 1.7201 - regression_loss: 1.4532 - classification_loss: 0.2669 23/500 [>.............................] - ETA: 2:37 - loss: 1.7021 - regression_loss: 1.4387 - classification_loss: 0.2633 24/500 [>.............................] - ETA: 2:36 - loss: 1.6725 - regression_loss: 1.4122 - classification_loss: 0.2602 25/500 [>.............................] - ETA: 2:37 - loss: 1.6503 - regression_loss: 1.3936 - classification_loss: 0.2567 26/500 [>.............................] - ETA: 2:36 - loss: 1.6449 - regression_loss: 1.3891 - classification_loss: 0.2559 27/500 [>.............................] - ETA: 2:36 - loss: 1.6506 - regression_loss: 1.3958 - classification_loss: 0.2548 28/500 [>.............................] - ETA: 2:35 - loss: 1.6517 - regression_loss: 1.3971 - classification_loss: 0.2546 29/500 [>.............................] - ETA: 2:35 - loss: 1.6573 - regression_loss: 1.4017 - classification_loss: 0.2556 30/500 [>.............................] 
 31/500 [>.............................] - ETA: 2:34 - loss: 1.6766 - regression_loss: 1.4176 - classification_loss: 0.2590
 50/500 [==>...........................] - ETA: 2:26 - loss: 1.6149 - regression_loss: 1.3661 - classification_loss: 0.2487
100/500 [=====>........................] - ETA: 2:10 - loss: 1.5326 - regression_loss: 1.3012 - classification_loss: 0.2314
150/500 [========>.....................] - ETA: 1:54 - loss: 1.5223 - regression_loss: 1.2900 - classification_loss: 0.2323
200/500 [===========>..................] - ETA: 1:38 - loss: 1.5479 - regression_loss: 1.3128 - classification_loss: 0.2352
250/500 [==============>...............] - ETA: 1:22 - loss: 1.5370 - regression_loss: 1.3053 - classification_loss: 0.2317
300/500 [=================>............] - ETA: 1:05 - loss: 1.5220 - regression_loss: 1.2914 - classification_loss: 0.2305
350/500 [====================>.........] - ETA: 49s - loss: 1.5281 - regression_loss: 1.2960 - classification_loss: 0.2320
365/500 [====================>.........] - ETA: 44s - loss: 1.5304 - regression_loss: 1.2975 - classification_loss: 0.2330
- ETA: 43s - loss: 1.5319 - regression_loss: 1.2988 - classification_loss: 0.2332 367/500 [=====================>........] - ETA: 43s - loss: 1.5314 - regression_loss: 1.2983 - classification_loss: 0.2330 368/500 [=====================>........] - ETA: 43s - loss: 1.5316 - regression_loss: 1.2986 - classification_loss: 0.2330 369/500 [=====================>........] - ETA: 42s - loss: 1.5297 - regression_loss: 1.2969 - classification_loss: 0.2328 370/500 [=====================>........] - ETA: 42s - loss: 1.5295 - regression_loss: 1.2966 - classification_loss: 0.2329 371/500 [=====================>........] - ETA: 42s - loss: 1.5284 - regression_loss: 1.2957 - classification_loss: 0.2327 372/500 [=====================>........] - ETA: 41s - loss: 1.5305 - regression_loss: 1.2975 - classification_loss: 0.2330 373/500 [=====================>........] - ETA: 41s - loss: 1.5292 - regression_loss: 1.2965 - classification_loss: 0.2327 374/500 [=====================>........] - ETA: 41s - loss: 1.5286 - regression_loss: 1.2960 - classification_loss: 0.2326 375/500 [=====================>........] - ETA: 40s - loss: 1.5286 - regression_loss: 1.2963 - classification_loss: 0.2323 376/500 [=====================>........] - ETA: 40s - loss: 1.5264 - regression_loss: 1.2943 - classification_loss: 0.2321 377/500 [=====================>........] - ETA: 40s - loss: 1.5256 - regression_loss: 1.2937 - classification_loss: 0.2319 378/500 [=====================>........] - ETA: 39s - loss: 1.5255 - regression_loss: 1.2936 - classification_loss: 0.2319 379/500 [=====================>........] - ETA: 39s - loss: 1.5258 - regression_loss: 1.2938 - classification_loss: 0.2319 380/500 [=====================>........] - ETA: 39s - loss: 1.5256 - regression_loss: 1.2936 - classification_loss: 0.2320 381/500 [=====================>........] - ETA: 39s - loss: 1.5266 - regression_loss: 1.2944 - classification_loss: 0.2322 382/500 [=====================>........] 
- ETA: 38s - loss: 1.5272 - regression_loss: 1.2950 - classification_loss: 0.2322 383/500 [=====================>........] - ETA: 38s - loss: 1.5262 - regression_loss: 1.2940 - classification_loss: 0.2321 384/500 [======================>.......] - ETA: 38s - loss: 1.5273 - regression_loss: 1.2946 - classification_loss: 0.2327 385/500 [======================>.......] - ETA: 37s - loss: 1.5281 - regression_loss: 1.2952 - classification_loss: 0.2328 386/500 [======================>.......] - ETA: 37s - loss: 1.5279 - regression_loss: 1.2951 - classification_loss: 0.2328 387/500 [======================>.......] - ETA: 37s - loss: 1.5270 - regression_loss: 1.2943 - classification_loss: 0.2326 388/500 [======================>.......] - ETA: 36s - loss: 1.5253 - regression_loss: 1.2930 - classification_loss: 0.2324 389/500 [======================>.......] - ETA: 36s - loss: 1.5264 - regression_loss: 1.2939 - classification_loss: 0.2325 390/500 [======================>.......] - ETA: 36s - loss: 1.5270 - regression_loss: 1.2944 - classification_loss: 0.2326 391/500 [======================>.......] - ETA: 35s - loss: 1.5258 - regression_loss: 1.2930 - classification_loss: 0.2328 392/500 [======================>.......] - ETA: 35s - loss: 1.5270 - regression_loss: 1.2940 - classification_loss: 0.2330 393/500 [======================>.......] - ETA: 35s - loss: 1.5271 - regression_loss: 1.2941 - classification_loss: 0.2330 394/500 [======================>.......] - ETA: 34s - loss: 1.5266 - regression_loss: 1.2937 - classification_loss: 0.2329 395/500 [======================>.......] - ETA: 34s - loss: 1.5269 - regression_loss: 1.2939 - classification_loss: 0.2330 396/500 [======================>.......] - ETA: 34s - loss: 1.5273 - regression_loss: 1.2943 - classification_loss: 0.2330 397/500 [======================>.......] - ETA: 33s - loss: 1.5269 - regression_loss: 1.2939 - classification_loss: 0.2330 398/500 [======================>.......] 
- ETA: 33s - loss: 1.5274 - regression_loss: 1.2943 - classification_loss: 0.2331 399/500 [======================>.......] - ETA: 33s - loss: 1.5275 - regression_loss: 1.2944 - classification_loss: 0.2332 400/500 [=======================>......] - ETA: 32s - loss: 1.5268 - regression_loss: 1.2937 - classification_loss: 0.2331 401/500 [=======================>......] - ETA: 32s - loss: 1.5280 - regression_loss: 1.2948 - classification_loss: 0.2332 402/500 [=======================>......] - ETA: 32s - loss: 1.5286 - regression_loss: 1.2953 - classification_loss: 0.2333 403/500 [=======================>......] - ETA: 31s - loss: 1.5294 - regression_loss: 1.2960 - classification_loss: 0.2334 404/500 [=======================>......] - ETA: 31s - loss: 1.5284 - regression_loss: 1.2951 - classification_loss: 0.2333 405/500 [=======================>......] - ETA: 31s - loss: 1.5288 - regression_loss: 1.2955 - classification_loss: 0.2334 406/500 [=======================>......] - ETA: 30s - loss: 1.5290 - regression_loss: 1.2957 - classification_loss: 0.2333 407/500 [=======================>......] - ETA: 30s - loss: 1.5293 - regression_loss: 1.2961 - classification_loss: 0.2332 408/500 [=======================>......] - ETA: 30s - loss: 1.5309 - regression_loss: 1.2974 - classification_loss: 0.2335 409/500 [=======================>......] - ETA: 29s - loss: 1.5321 - regression_loss: 1.2984 - classification_loss: 0.2337 410/500 [=======================>......] - ETA: 29s - loss: 1.5316 - regression_loss: 1.2979 - classification_loss: 0.2337 411/500 [=======================>......] - ETA: 29s - loss: 1.5331 - regression_loss: 1.2992 - classification_loss: 0.2339 412/500 [=======================>......] - ETA: 28s - loss: 1.5335 - regression_loss: 1.2995 - classification_loss: 0.2339 413/500 [=======================>......] - ETA: 28s - loss: 1.5342 - regression_loss: 1.3001 - classification_loss: 0.2340 414/500 [=======================>......] 
- ETA: 28s - loss: 1.5334 - regression_loss: 1.2992 - classification_loss: 0.2342 415/500 [=======================>......] - ETA: 27s - loss: 1.5308 - regression_loss: 1.2968 - classification_loss: 0.2339 416/500 [=======================>......] - ETA: 27s - loss: 1.5312 - regression_loss: 1.2972 - classification_loss: 0.2340 417/500 [========================>.....] - ETA: 27s - loss: 1.5306 - regression_loss: 1.2966 - classification_loss: 0.2340 418/500 [========================>.....] - ETA: 26s - loss: 1.5309 - regression_loss: 1.2970 - classification_loss: 0.2340 419/500 [========================>.....] - ETA: 26s - loss: 1.5320 - regression_loss: 1.2980 - classification_loss: 0.2341 420/500 [========================>.....] - ETA: 26s - loss: 1.5314 - regression_loss: 1.2974 - classification_loss: 0.2340 421/500 [========================>.....] - ETA: 25s - loss: 1.5317 - regression_loss: 1.2977 - classification_loss: 0.2340 422/500 [========================>.....] - ETA: 25s - loss: 1.5297 - regression_loss: 1.2960 - classification_loss: 0.2337 423/500 [========================>.....] - ETA: 25s - loss: 1.5275 - regression_loss: 1.2940 - classification_loss: 0.2334 424/500 [========================>.....] - ETA: 24s - loss: 1.5274 - regression_loss: 1.2940 - classification_loss: 0.2333 425/500 [========================>.....] - ETA: 24s - loss: 1.5271 - regression_loss: 1.2939 - classification_loss: 0.2332 426/500 [========================>.....] - ETA: 24s - loss: 1.5276 - regression_loss: 1.2946 - classification_loss: 0.2330 427/500 [========================>.....] - ETA: 23s - loss: 1.5262 - regression_loss: 1.2933 - classification_loss: 0.2329 428/500 [========================>.....] - ETA: 23s - loss: 1.5271 - regression_loss: 1.2940 - classification_loss: 0.2330 429/500 [========================>.....] - ETA: 23s - loss: 1.5275 - regression_loss: 1.2944 - classification_loss: 0.2331 430/500 [========================>.....] 
- ETA: 23s - loss: 1.5278 - regression_loss: 1.2947 - classification_loss: 0.2331 431/500 [========================>.....] - ETA: 22s - loss: 1.5279 - regression_loss: 1.2948 - classification_loss: 0.2331 432/500 [========================>.....] - ETA: 22s - loss: 1.5267 - regression_loss: 1.2938 - classification_loss: 0.2329 433/500 [========================>.....] - ETA: 22s - loss: 1.5260 - regression_loss: 1.2933 - classification_loss: 0.2328 434/500 [=========================>....] - ETA: 21s - loss: 1.5261 - regression_loss: 1.2933 - classification_loss: 0.2327 435/500 [=========================>....] - ETA: 21s - loss: 1.5267 - regression_loss: 1.2940 - classification_loss: 0.2328 436/500 [=========================>....] - ETA: 21s - loss: 1.5290 - regression_loss: 1.2959 - classification_loss: 0.2331 437/500 [=========================>....] - ETA: 20s - loss: 1.5269 - regression_loss: 1.2940 - classification_loss: 0.2328 438/500 [=========================>....] - ETA: 20s - loss: 1.5254 - regression_loss: 1.2928 - classification_loss: 0.2326 439/500 [=========================>....] - ETA: 20s - loss: 1.5256 - regression_loss: 1.2931 - classification_loss: 0.2325 440/500 [=========================>....] - ETA: 19s - loss: 1.5254 - regression_loss: 1.2929 - classification_loss: 0.2325 441/500 [=========================>....] - ETA: 19s - loss: 1.5265 - regression_loss: 1.2937 - classification_loss: 0.2328 442/500 [=========================>....] - ETA: 19s - loss: 1.5264 - regression_loss: 1.2936 - classification_loss: 0.2328 443/500 [=========================>....] - ETA: 18s - loss: 1.5245 - regression_loss: 1.2920 - classification_loss: 0.2325 444/500 [=========================>....] - ETA: 18s - loss: 1.5234 - regression_loss: 1.2911 - classification_loss: 0.2323 445/500 [=========================>....] - ETA: 18s - loss: 1.5234 - regression_loss: 1.2912 - classification_loss: 0.2322 446/500 [=========================>....] 
- ETA: 17s - loss: 1.5241 - regression_loss: 1.2917 - classification_loss: 0.2324 447/500 [=========================>....] - ETA: 17s - loss: 1.5239 - regression_loss: 1.2915 - classification_loss: 0.2323 448/500 [=========================>....] - ETA: 17s - loss: 1.5233 - regression_loss: 1.2910 - classification_loss: 0.2322 449/500 [=========================>....] - ETA: 16s - loss: 1.5222 - regression_loss: 1.2902 - classification_loss: 0.2320 450/500 [==========================>...] - ETA: 16s - loss: 1.5224 - regression_loss: 1.2905 - classification_loss: 0.2318 451/500 [==========================>...] - ETA: 16s - loss: 1.5222 - regression_loss: 1.2905 - classification_loss: 0.2318 452/500 [==========================>...] - ETA: 15s - loss: 1.5220 - regression_loss: 1.2903 - classification_loss: 0.2317 453/500 [==========================>...] - ETA: 15s - loss: 1.5203 - regression_loss: 1.2888 - classification_loss: 0.2315 454/500 [==========================>...] - ETA: 15s - loss: 1.5213 - regression_loss: 1.2897 - classification_loss: 0.2315 455/500 [==========================>...] - ETA: 14s - loss: 1.5208 - regression_loss: 1.2894 - classification_loss: 0.2315 456/500 [==========================>...] - ETA: 14s - loss: 1.5193 - regression_loss: 1.2880 - classification_loss: 0.2313 457/500 [==========================>...] - ETA: 14s - loss: 1.5201 - regression_loss: 1.2889 - classification_loss: 0.2313 458/500 [==========================>...] - ETA: 13s - loss: 1.5191 - regression_loss: 1.2880 - classification_loss: 0.2311 459/500 [==========================>...] - ETA: 13s - loss: 1.5176 - regression_loss: 1.2867 - classification_loss: 0.2309 460/500 [==========================>...] - ETA: 13s - loss: 1.5177 - regression_loss: 1.2868 - classification_loss: 0.2309 461/500 [==========================>...] - ETA: 12s - loss: 1.5176 - regression_loss: 1.2869 - classification_loss: 0.2308 462/500 [==========================>...] 
- ETA: 12s - loss: 1.5154 - regression_loss: 1.2850 - classification_loss: 0.2304 463/500 [==========================>...] - ETA: 12s - loss: 1.5146 - regression_loss: 1.2843 - classification_loss: 0.2303 464/500 [==========================>...] - ETA: 11s - loss: 1.5139 - regression_loss: 1.2837 - classification_loss: 0.2302 465/500 [==========================>...] - ETA: 11s - loss: 1.5131 - regression_loss: 1.2831 - classification_loss: 0.2300 466/500 [==========================>...] - ETA: 11s - loss: 1.5144 - regression_loss: 1.2842 - classification_loss: 0.2302 467/500 [===========================>..] - ETA: 10s - loss: 1.5156 - regression_loss: 1.2852 - classification_loss: 0.2304 468/500 [===========================>..] - ETA: 10s - loss: 1.5148 - regression_loss: 1.2845 - classification_loss: 0.2304 469/500 [===========================>..] - ETA: 10s - loss: 1.5129 - regression_loss: 1.2828 - classification_loss: 0.2301 470/500 [===========================>..] - ETA: 9s - loss: 1.5140 - regression_loss: 1.2837 - classification_loss: 0.2302  471/500 [===========================>..] - ETA: 9s - loss: 1.5137 - regression_loss: 1.2832 - classification_loss: 0.2305 472/500 [===========================>..] - ETA: 9s - loss: 1.5134 - regression_loss: 1.2828 - classification_loss: 0.2305 473/500 [===========================>..] - ETA: 8s - loss: 1.5124 - regression_loss: 1.2821 - classification_loss: 0.2303 474/500 [===========================>..] - ETA: 8s - loss: 1.5122 - regression_loss: 1.2821 - classification_loss: 0.2301 475/500 [===========================>..] - ETA: 8s - loss: 1.5124 - regression_loss: 1.2822 - classification_loss: 0.2302 476/500 [===========================>..] - ETA: 7s - loss: 1.5124 - regression_loss: 1.2823 - classification_loss: 0.2301 477/500 [===========================>..] - ETA: 7s - loss: 1.5121 - regression_loss: 1.2820 - classification_loss: 0.2301 478/500 [===========================>..] 
- ETA: 7s - loss: 1.5103 - regression_loss: 1.2804 - classification_loss: 0.2299 479/500 [===========================>..] - ETA: 6s - loss: 1.5089 - regression_loss: 1.2793 - classification_loss: 0.2296 480/500 [===========================>..] - ETA: 6s - loss: 1.5096 - regression_loss: 1.2798 - classification_loss: 0.2298 481/500 [===========================>..] - ETA: 6s - loss: 1.5089 - regression_loss: 1.2792 - classification_loss: 0.2297 482/500 [===========================>..] - ETA: 5s - loss: 1.5084 - regression_loss: 1.2788 - classification_loss: 0.2296 483/500 [===========================>..] - ETA: 5s - loss: 1.5065 - regression_loss: 1.2771 - classification_loss: 0.2294 484/500 [============================>.] - ETA: 5s - loss: 1.5073 - regression_loss: 1.2778 - classification_loss: 0.2295 485/500 [============================>.] - ETA: 4s - loss: 1.5063 - regression_loss: 1.2769 - classification_loss: 0.2294 486/500 [============================>.] - ETA: 4s - loss: 1.5063 - regression_loss: 1.2768 - classification_loss: 0.2295 487/500 [============================>.] - ETA: 4s - loss: 1.5054 - regression_loss: 1.2758 - classification_loss: 0.2295 488/500 [============================>.] - ETA: 3s - loss: 1.5045 - regression_loss: 1.2752 - classification_loss: 0.2293 489/500 [============================>.] - ETA: 3s - loss: 1.5049 - regression_loss: 1.2756 - classification_loss: 0.2293 490/500 [============================>.] - ETA: 3s - loss: 1.5034 - regression_loss: 1.2743 - classification_loss: 0.2290 491/500 [============================>.] - ETA: 2s - loss: 1.5038 - regression_loss: 1.2749 - classification_loss: 0.2289 492/500 [============================>.] - ETA: 2s - loss: 1.5041 - regression_loss: 1.2751 - classification_loss: 0.2290 493/500 [============================>.] - ETA: 2s - loss: 1.5045 - regression_loss: 1.2756 - classification_loss: 0.2290 494/500 [============================>.] 
[steps 495-499/500 of epoch 16: per-batch progress-bar redraws elided; running loss ~1.504-1.506]
500/500 [==============================] - 165s 329ms/step - loss: 1.5052 - regression_loss: 1.2762 - classification_loss: 0.2290
1172 instances of class plum with average precision: 0.5774
mAP: 0.5774
Epoch 00016: saving model to ./training/snapshots/resnet101_pascal_16.h5
Epoch 17/150
[steps 1-8/500 of epoch 17: per-batch progress-bar redraws elided; running loss rose from 0.52 to ~1.28 over the first eight batches]
[steps 9-137/500 of epoch 17: per-batch progress-bar redraws elided; running loss climbed from ~1.23 to a peak of ~1.57 around step 57, then eased back to ~1.50 by step 137 (classification_loss ~0.19-0.24)]
- ETA: 2:00 - loss: 1.4963 - regression_loss: 1.2700 - classification_loss: 0.2263 138/500 [=======>......................] - ETA: 2:00 - loss: 1.4920 - regression_loss: 1.2664 - classification_loss: 0.2256 139/500 [=======>......................] - ETA: 2:00 - loss: 1.4899 - regression_loss: 1.2647 - classification_loss: 0.2252 140/500 [=======>......................] - ETA: 1:59 - loss: 1.4939 - regression_loss: 1.2681 - classification_loss: 0.2258 141/500 [=======>......................] - ETA: 1:59 - loss: 1.4938 - regression_loss: 1.2680 - classification_loss: 0.2257 142/500 [=======>......................] - ETA: 1:59 - loss: 1.4946 - regression_loss: 1.2688 - classification_loss: 0.2259 143/500 [=======>......................] - ETA: 1:58 - loss: 1.4908 - regression_loss: 1.2656 - classification_loss: 0.2252 144/500 [=======>......................] - ETA: 1:58 - loss: 1.4945 - regression_loss: 1.2689 - classification_loss: 0.2256 145/500 [=======>......................] - ETA: 1:58 - loss: 1.4938 - regression_loss: 1.2681 - classification_loss: 0.2257 146/500 [=======>......................] - ETA: 1:57 - loss: 1.4935 - regression_loss: 1.2673 - classification_loss: 0.2262 147/500 [=======>......................] - ETA: 1:57 - loss: 1.4951 - regression_loss: 1.2688 - classification_loss: 0.2263 148/500 [=======>......................] - ETA: 1:57 - loss: 1.4945 - regression_loss: 1.2684 - classification_loss: 0.2261 149/500 [=======>......................] - ETA: 1:56 - loss: 1.4883 - regression_loss: 1.2630 - classification_loss: 0.2253 150/500 [========>.....................] - ETA: 1:56 - loss: 1.4902 - regression_loss: 1.2647 - classification_loss: 0.2255 151/500 [========>.....................] - ETA: 1:56 - loss: 1.4907 - regression_loss: 1.2652 - classification_loss: 0.2254 152/500 [========>.....................] - ETA: 1:55 - loss: 1.4893 - regression_loss: 1.2641 - classification_loss: 0.2252 153/500 [========>.....................] 
- ETA: 1:55 - loss: 1.4842 - regression_loss: 1.2599 - classification_loss: 0.2243 154/500 [========>.....................] - ETA: 1:55 - loss: 1.4849 - regression_loss: 1.2602 - classification_loss: 0.2247 155/500 [========>.....................] - ETA: 1:54 - loss: 1.4847 - regression_loss: 1.2603 - classification_loss: 0.2243 156/500 [========>.....................] - ETA: 1:54 - loss: 1.4860 - regression_loss: 1.2617 - classification_loss: 0.2244 157/500 [========>.....................] - ETA: 1:54 - loss: 1.4846 - regression_loss: 1.2608 - classification_loss: 0.2238 158/500 [========>.....................] - ETA: 1:53 - loss: 1.4849 - regression_loss: 1.2613 - classification_loss: 0.2236 159/500 [========>.....................] - ETA: 1:53 - loss: 1.4868 - regression_loss: 1.2629 - classification_loss: 0.2239 160/500 [========>.....................] - ETA: 1:53 - loss: 1.4873 - regression_loss: 1.2637 - classification_loss: 0.2236 161/500 [========>.....................] - ETA: 1:52 - loss: 1.4891 - regression_loss: 1.2655 - classification_loss: 0.2237 162/500 [========>.....................] - ETA: 1:52 - loss: 1.4913 - regression_loss: 1.2674 - classification_loss: 0.2240 163/500 [========>.....................] - ETA: 1:52 - loss: 1.4899 - regression_loss: 1.2661 - classification_loss: 0.2237 164/500 [========>.....................] - ETA: 1:51 - loss: 1.4903 - regression_loss: 1.2666 - classification_loss: 0.2237 165/500 [========>.....................] - ETA: 1:51 - loss: 1.4860 - regression_loss: 1.2632 - classification_loss: 0.2228 166/500 [========>.....................] - ETA: 1:50 - loss: 1.4865 - regression_loss: 1.2637 - classification_loss: 0.2227 167/500 [=========>....................] - ETA: 1:50 - loss: 1.4875 - regression_loss: 1.2648 - classification_loss: 0.2227 168/500 [=========>....................] - ETA: 1:50 - loss: 1.4878 - regression_loss: 1.2651 - classification_loss: 0.2227 169/500 [=========>....................] 
- ETA: 1:49 - loss: 1.4933 - regression_loss: 1.2698 - classification_loss: 0.2235 170/500 [=========>....................] - ETA: 1:49 - loss: 1.4949 - regression_loss: 1.2711 - classification_loss: 0.2237 171/500 [=========>....................] - ETA: 1:49 - loss: 1.4900 - regression_loss: 1.2669 - classification_loss: 0.2231 172/500 [=========>....................] - ETA: 1:48 - loss: 1.4929 - regression_loss: 1.2695 - classification_loss: 0.2234 173/500 [=========>....................] - ETA: 1:48 - loss: 1.4935 - regression_loss: 1.2699 - classification_loss: 0.2236 174/500 [=========>....................] - ETA: 1:48 - loss: 1.4919 - regression_loss: 1.2686 - classification_loss: 0.2233 175/500 [=========>....................] - ETA: 1:47 - loss: 1.4949 - regression_loss: 1.2707 - classification_loss: 0.2242 176/500 [=========>....................] - ETA: 1:47 - loss: 1.4959 - regression_loss: 1.2717 - classification_loss: 0.2242 177/500 [=========>....................] - ETA: 1:47 - loss: 1.4952 - regression_loss: 1.2710 - classification_loss: 0.2242 178/500 [=========>....................] - ETA: 1:46 - loss: 1.4952 - regression_loss: 1.2712 - classification_loss: 0.2240 179/500 [=========>....................] - ETA: 1:46 - loss: 1.4979 - regression_loss: 1.2732 - classification_loss: 0.2246 180/500 [=========>....................] - ETA: 1:46 - loss: 1.5015 - regression_loss: 1.2763 - classification_loss: 0.2252 181/500 [=========>....................] - ETA: 1:45 - loss: 1.5010 - regression_loss: 1.2760 - classification_loss: 0.2250 182/500 [=========>....................] - ETA: 1:45 - loss: 1.4980 - regression_loss: 1.2735 - classification_loss: 0.2245 183/500 [=========>....................] - ETA: 1:45 - loss: 1.5014 - regression_loss: 1.2761 - classification_loss: 0.2253 184/500 [==========>...................] - ETA: 1:44 - loss: 1.5016 - regression_loss: 1.2761 - classification_loss: 0.2255 185/500 [==========>...................] 
- ETA: 1:44 - loss: 1.5028 - regression_loss: 1.2771 - classification_loss: 0.2257 186/500 [==========>...................] - ETA: 1:44 - loss: 1.5014 - regression_loss: 1.2760 - classification_loss: 0.2253 187/500 [==========>...................] - ETA: 1:43 - loss: 1.4966 - regression_loss: 1.2718 - classification_loss: 0.2248 188/500 [==========>...................] - ETA: 1:43 - loss: 1.4969 - regression_loss: 1.2716 - classification_loss: 0.2253 189/500 [==========>...................] - ETA: 1:43 - loss: 1.4974 - regression_loss: 1.2720 - classification_loss: 0.2255 190/500 [==========>...................] - ETA: 1:42 - loss: 1.4961 - regression_loss: 1.2709 - classification_loss: 0.2252 191/500 [==========>...................] - ETA: 1:42 - loss: 1.4974 - regression_loss: 1.2719 - classification_loss: 0.2255 192/500 [==========>...................] - ETA: 1:42 - loss: 1.4972 - regression_loss: 1.2717 - classification_loss: 0.2255 193/500 [==========>...................] - ETA: 1:41 - loss: 1.4957 - regression_loss: 1.2704 - classification_loss: 0.2253 194/500 [==========>...................] - ETA: 1:41 - loss: 1.4980 - regression_loss: 1.2723 - classification_loss: 0.2257 195/500 [==========>...................] - ETA: 1:41 - loss: 1.4992 - regression_loss: 1.2735 - classification_loss: 0.2257 196/500 [==========>...................] - ETA: 1:40 - loss: 1.5024 - regression_loss: 1.2763 - classification_loss: 0.2261 197/500 [==========>...................] - ETA: 1:40 - loss: 1.4986 - regression_loss: 1.2730 - classification_loss: 0.2256 198/500 [==========>...................] - ETA: 1:40 - loss: 1.5025 - regression_loss: 1.2758 - classification_loss: 0.2267 199/500 [==========>...................] - ETA: 1:39 - loss: 1.5046 - regression_loss: 1.2774 - classification_loss: 0.2272 200/500 [===========>..................] - ETA: 1:39 - loss: 1.5041 - regression_loss: 1.2769 - classification_loss: 0.2273 201/500 [===========>..................] 
- ETA: 1:39 - loss: 1.5026 - regression_loss: 1.2754 - classification_loss: 0.2272 202/500 [===========>..................] - ETA: 1:38 - loss: 1.5020 - regression_loss: 1.2746 - classification_loss: 0.2274 203/500 [===========>..................] - ETA: 1:38 - loss: 1.4985 - regression_loss: 1.2715 - classification_loss: 0.2270 204/500 [===========>..................] - ETA: 1:38 - loss: 1.4998 - regression_loss: 1.2726 - classification_loss: 0.2271 205/500 [===========>..................] - ETA: 1:37 - loss: 1.5035 - regression_loss: 1.2754 - classification_loss: 0.2281 206/500 [===========>..................] - ETA: 1:37 - loss: 1.5017 - regression_loss: 1.2740 - classification_loss: 0.2277 207/500 [===========>..................] - ETA: 1:37 - loss: 1.5001 - regression_loss: 1.2729 - classification_loss: 0.2273 208/500 [===========>..................] - ETA: 1:36 - loss: 1.4966 - regression_loss: 1.2697 - classification_loss: 0.2269 209/500 [===========>..................] - ETA: 1:36 - loss: 1.4988 - regression_loss: 1.2713 - classification_loss: 0.2275 210/500 [===========>..................] - ETA: 1:36 - loss: 1.5003 - regression_loss: 1.2725 - classification_loss: 0.2278 211/500 [===========>..................] - ETA: 1:35 - loss: 1.5019 - regression_loss: 1.2739 - classification_loss: 0.2280 212/500 [===========>..................] - ETA: 1:35 - loss: 1.5015 - regression_loss: 1.2736 - classification_loss: 0.2280 213/500 [===========>..................] - ETA: 1:35 - loss: 1.5006 - regression_loss: 1.2727 - classification_loss: 0.2278 214/500 [===========>..................] - ETA: 1:34 - loss: 1.4999 - regression_loss: 1.2723 - classification_loss: 0.2276 215/500 [===========>..................] - ETA: 1:34 - loss: 1.5006 - regression_loss: 1.2730 - classification_loss: 0.2276 216/500 [===========>..................] - ETA: 1:34 - loss: 1.5023 - regression_loss: 1.2744 - classification_loss: 0.2279 217/500 [============>.................] 
- ETA: 1:33 - loss: 1.5040 - regression_loss: 1.2760 - classification_loss: 0.2281 218/500 [============>.................] - ETA: 1:33 - loss: 1.5033 - regression_loss: 1.2757 - classification_loss: 0.2276 219/500 [============>.................] - ETA: 1:33 - loss: 1.5041 - regression_loss: 1.2763 - classification_loss: 0.2278 220/500 [============>.................] - ETA: 1:33 - loss: 1.5024 - regression_loss: 1.2748 - classification_loss: 0.2276 221/500 [============>.................] - ETA: 1:32 - loss: 1.5012 - regression_loss: 1.2737 - classification_loss: 0.2275 222/500 [============>.................] - ETA: 1:32 - loss: 1.5034 - regression_loss: 1.2754 - classification_loss: 0.2280 223/500 [============>.................] - ETA: 1:32 - loss: 1.5006 - regression_loss: 1.2729 - classification_loss: 0.2277 224/500 [============>.................] - ETA: 1:31 - loss: 1.5023 - regression_loss: 1.2745 - classification_loss: 0.2278 225/500 [============>.................] - ETA: 1:31 - loss: 1.5016 - regression_loss: 1.2738 - classification_loss: 0.2278 226/500 [============>.................] - ETA: 1:31 - loss: 1.5026 - regression_loss: 1.2747 - classification_loss: 0.2279 227/500 [============>.................] - ETA: 1:30 - loss: 1.5043 - regression_loss: 1.2762 - classification_loss: 0.2281 228/500 [============>.................] - ETA: 1:30 - loss: 1.5004 - regression_loss: 1.2726 - classification_loss: 0.2278 229/500 [============>.................] - ETA: 1:30 - loss: 1.4990 - regression_loss: 1.2716 - classification_loss: 0.2274 230/500 [============>.................] - ETA: 1:29 - loss: 1.5005 - regression_loss: 1.2728 - classification_loss: 0.2277 231/500 [============>.................] - ETA: 1:29 - loss: 1.5011 - regression_loss: 1.2733 - classification_loss: 0.2278 232/500 [============>.................] - ETA: 1:29 - loss: 1.5022 - regression_loss: 1.2743 - classification_loss: 0.2279 233/500 [============>.................] 
- ETA: 1:28 - loss: 1.5032 - regression_loss: 1.2751 - classification_loss: 0.2280 234/500 [=============>................] - ETA: 1:28 - loss: 1.5045 - regression_loss: 1.2765 - classification_loss: 0.2279 235/500 [=============>................] - ETA: 1:28 - loss: 1.5035 - regression_loss: 1.2759 - classification_loss: 0.2276 236/500 [=============>................] - ETA: 1:27 - loss: 1.5045 - regression_loss: 1.2767 - classification_loss: 0.2277 237/500 [=============>................] - ETA: 1:27 - loss: 1.5020 - regression_loss: 1.2747 - classification_loss: 0.2273 238/500 [=============>................] - ETA: 1:27 - loss: 1.5022 - regression_loss: 1.2748 - classification_loss: 0.2274 239/500 [=============>................] - ETA: 1:26 - loss: 1.5014 - regression_loss: 1.2741 - classification_loss: 0.2273 240/500 [=============>................] - ETA: 1:26 - loss: 1.5019 - regression_loss: 1.2746 - classification_loss: 0.2273 241/500 [=============>................] - ETA: 1:26 - loss: 1.4995 - regression_loss: 1.2727 - classification_loss: 0.2268 242/500 [=============>................] - ETA: 1:25 - loss: 1.4966 - regression_loss: 1.2703 - classification_loss: 0.2263 243/500 [=============>................] - ETA: 1:25 - loss: 1.4988 - regression_loss: 1.2721 - classification_loss: 0.2268 244/500 [=============>................] - ETA: 1:25 - loss: 1.4965 - regression_loss: 1.2701 - classification_loss: 0.2264 245/500 [=============>................] - ETA: 1:24 - loss: 1.4936 - regression_loss: 1.2677 - classification_loss: 0.2259 246/500 [=============>................] - ETA: 1:24 - loss: 1.4942 - regression_loss: 1.2683 - classification_loss: 0.2259 247/500 [=============>................] - ETA: 1:24 - loss: 1.4927 - regression_loss: 1.2670 - classification_loss: 0.2257 248/500 [=============>................] - ETA: 1:23 - loss: 1.4949 - regression_loss: 1.2687 - classification_loss: 0.2262 249/500 [=============>................] 
- ETA: 1:23 - loss: 1.4963 - regression_loss: 1.2700 - classification_loss: 0.2263 250/500 [==============>...............] - ETA: 1:23 - loss: 1.4981 - regression_loss: 1.2716 - classification_loss: 0.2265 251/500 [==============>...............] - ETA: 1:22 - loss: 1.4956 - regression_loss: 1.2695 - classification_loss: 0.2262 252/500 [==============>...............] - ETA: 1:22 - loss: 1.4921 - regression_loss: 1.2664 - classification_loss: 0.2257 253/500 [==============>...............] - ETA: 1:22 - loss: 1.4927 - regression_loss: 1.2669 - classification_loss: 0.2258 254/500 [==============>...............] - ETA: 1:21 - loss: 1.4935 - regression_loss: 1.2680 - classification_loss: 0.2255 255/500 [==============>...............] - ETA: 1:21 - loss: 1.4955 - regression_loss: 1.2694 - classification_loss: 0.2261 256/500 [==============>...............] - ETA: 1:21 - loss: 1.4957 - regression_loss: 1.2696 - classification_loss: 0.2261 257/500 [==============>...............] - ETA: 1:20 - loss: 1.4981 - regression_loss: 1.2717 - classification_loss: 0.2264 258/500 [==============>...............] - ETA: 1:20 - loss: 1.4990 - regression_loss: 1.2726 - classification_loss: 0.2264 259/500 [==============>...............] - ETA: 1:20 - loss: 1.4985 - regression_loss: 1.2723 - classification_loss: 0.2262 260/500 [==============>...............] - ETA: 1:19 - loss: 1.4957 - regression_loss: 1.2700 - classification_loss: 0.2257 261/500 [==============>...............] - ETA: 1:19 - loss: 1.4983 - regression_loss: 1.2721 - classification_loss: 0.2262 262/500 [==============>...............] - ETA: 1:19 - loss: 1.4997 - regression_loss: 1.2735 - classification_loss: 0.2262 263/500 [==============>...............] - ETA: 1:18 - loss: 1.4991 - regression_loss: 1.2731 - classification_loss: 0.2259 264/500 [==============>...............] - ETA: 1:18 - loss: 1.5000 - regression_loss: 1.2740 - classification_loss: 0.2260 265/500 [==============>...............] 
- ETA: 1:18 - loss: 1.5000 - regression_loss: 1.2740 - classification_loss: 0.2260 266/500 [==============>...............] - ETA: 1:17 - loss: 1.4996 - regression_loss: 1.2739 - classification_loss: 0.2257 267/500 [===============>..............] - ETA: 1:17 - loss: 1.4967 - regression_loss: 1.2715 - classification_loss: 0.2252 268/500 [===============>..............] - ETA: 1:17 - loss: 1.4944 - regression_loss: 1.2690 - classification_loss: 0.2254 269/500 [===============>..............] - ETA: 1:16 - loss: 1.4960 - regression_loss: 1.2706 - classification_loss: 0.2255 270/500 [===============>..............] - ETA: 1:16 - loss: 1.4941 - regression_loss: 1.2690 - classification_loss: 0.2251 271/500 [===============>..............] - ETA: 1:16 - loss: 1.4956 - regression_loss: 1.2704 - classification_loss: 0.2253 272/500 [===============>..............] - ETA: 1:15 - loss: 1.4940 - regression_loss: 1.2688 - classification_loss: 0.2252 273/500 [===============>..............] - ETA: 1:15 - loss: 1.4951 - regression_loss: 1.2698 - classification_loss: 0.2253 274/500 [===============>..............] - ETA: 1:15 - loss: 1.4957 - regression_loss: 1.2702 - classification_loss: 0.2255 275/500 [===============>..............] - ETA: 1:14 - loss: 1.4962 - regression_loss: 1.2708 - classification_loss: 0.2254 276/500 [===============>..............] - ETA: 1:14 - loss: 1.4938 - regression_loss: 1.2686 - classification_loss: 0.2252 277/500 [===============>..............] - ETA: 1:14 - loss: 1.4926 - regression_loss: 1.2676 - classification_loss: 0.2250 278/500 [===============>..............] - ETA: 1:13 - loss: 1.4934 - regression_loss: 1.2685 - classification_loss: 0.2248 279/500 [===============>..............] - ETA: 1:13 - loss: 1.4946 - regression_loss: 1.2697 - classification_loss: 0.2249 280/500 [===============>..............] - ETA: 1:13 - loss: 1.4968 - regression_loss: 1.2717 - classification_loss: 0.2252 281/500 [===============>..............] 
- ETA: 1:12 - loss: 1.4982 - regression_loss: 1.2728 - classification_loss: 0.2254 282/500 [===============>..............] - ETA: 1:12 - loss: 1.4983 - regression_loss: 1.2729 - classification_loss: 0.2254 283/500 [===============>..............] - ETA: 1:12 - loss: 1.4979 - regression_loss: 1.2726 - classification_loss: 0.2253 284/500 [================>.............] - ETA: 1:11 - loss: 1.4990 - regression_loss: 1.2736 - classification_loss: 0.2254 285/500 [================>.............] - ETA: 1:11 - loss: 1.4985 - regression_loss: 1.2734 - classification_loss: 0.2251 286/500 [================>.............] - ETA: 1:11 - loss: 1.4979 - regression_loss: 1.2729 - classification_loss: 0.2250 287/500 [================>.............] - ETA: 1:10 - loss: 1.4985 - regression_loss: 1.2735 - classification_loss: 0.2250 288/500 [================>.............] - ETA: 1:10 - loss: 1.4993 - regression_loss: 1.2741 - classification_loss: 0.2252 289/500 [================>.............] - ETA: 1:10 - loss: 1.5002 - regression_loss: 1.2750 - classification_loss: 0.2253 290/500 [================>.............] - ETA: 1:09 - loss: 1.5003 - regression_loss: 1.2749 - classification_loss: 0.2253 291/500 [================>.............] - ETA: 1:09 - loss: 1.4991 - regression_loss: 1.2741 - classification_loss: 0.2251 292/500 [================>.............] - ETA: 1:09 - loss: 1.4991 - regression_loss: 1.2739 - classification_loss: 0.2251 293/500 [================>.............] - ETA: 1:08 - loss: 1.4966 - regression_loss: 1.2718 - classification_loss: 0.2248 294/500 [================>.............] - ETA: 1:08 - loss: 1.4937 - regression_loss: 1.2692 - classification_loss: 0.2245 295/500 [================>.............] - ETA: 1:08 - loss: 1.4938 - regression_loss: 1.2693 - classification_loss: 0.2245 296/500 [================>.............] - ETA: 1:07 - loss: 1.4951 - regression_loss: 1.2704 - classification_loss: 0.2247 297/500 [================>.............] 
- ETA: 1:07 - loss: 1.4947 - regression_loss: 1.2702 - classification_loss: 0.2246 298/500 [================>.............] - ETA: 1:07 - loss: 1.4915 - regression_loss: 1.2674 - classification_loss: 0.2241 299/500 [================>.............] - ETA: 1:06 - loss: 1.4900 - regression_loss: 1.2662 - classification_loss: 0.2239 300/500 [=================>............] - ETA: 1:06 - loss: 1.4886 - regression_loss: 1.2650 - classification_loss: 0.2236 301/500 [=================>............] - ETA: 1:06 - loss: 1.4888 - regression_loss: 1.2650 - classification_loss: 0.2237 302/500 [=================>............] - ETA: 1:05 - loss: 1.4875 - regression_loss: 1.2640 - classification_loss: 0.2236 303/500 [=================>............] - ETA: 1:05 - loss: 1.4863 - regression_loss: 1.2628 - classification_loss: 0.2235 304/500 [=================>............] - ETA: 1:05 - loss: 1.4861 - regression_loss: 1.2626 - classification_loss: 0.2235 305/500 [=================>............] - ETA: 1:04 - loss: 1.4850 - regression_loss: 1.2617 - classification_loss: 0.2233 306/500 [=================>............] - ETA: 1:04 - loss: 1.4845 - regression_loss: 1.2613 - classification_loss: 0.2231 307/500 [=================>............] - ETA: 1:04 - loss: 1.4869 - regression_loss: 1.2634 - classification_loss: 0.2235 308/500 [=================>............] - ETA: 1:03 - loss: 1.4851 - regression_loss: 1.2619 - classification_loss: 0.2232 309/500 [=================>............] - ETA: 1:03 - loss: 1.4828 - regression_loss: 1.2599 - classification_loss: 0.2229 310/500 [=================>............] - ETA: 1:03 - loss: 1.4842 - regression_loss: 1.2610 - classification_loss: 0.2232 311/500 [=================>............] - ETA: 1:02 - loss: 1.4823 - regression_loss: 1.2594 - classification_loss: 0.2229 312/500 [=================>............] - ETA: 1:02 - loss: 1.4842 - regression_loss: 1.2610 - classification_loss: 0.2232 313/500 [=================>............] 
- ETA: 1:02 - loss: 1.4837 - regression_loss: 1.2606 - classification_loss: 0.2231 314/500 [=================>............] - ETA: 1:01 - loss: 1.4842 - regression_loss: 1.2611 - classification_loss: 0.2231 315/500 [=================>............] - ETA: 1:01 - loss: 1.4868 - regression_loss: 1.2633 - classification_loss: 0.2235 316/500 [=================>............] - ETA: 1:01 - loss: 1.4882 - regression_loss: 1.2646 - classification_loss: 0.2235 317/500 [==================>...........] - ETA: 1:00 - loss: 1.4883 - regression_loss: 1.2647 - classification_loss: 0.2236 318/500 [==================>...........] - ETA: 1:00 - loss: 1.4894 - regression_loss: 1.2657 - classification_loss: 0.2237 319/500 [==================>...........] - ETA: 1:00 - loss: 1.4907 - regression_loss: 1.2667 - classification_loss: 0.2240 320/500 [==================>...........] - ETA: 59s - loss: 1.4900 - regression_loss: 1.2663 - classification_loss: 0.2238  321/500 [==================>...........] - ETA: 59s - loss: 1.4872 - regression_loss: 1.2639 - classification_loss: 0.2234 322/500 [==================>...........] - ETA: 59s - loss: 1.4868 - regression_loss: 1.2635 - classification_loss: 0.2233 323/500 [==================>...........] - ETA: 58s - loss: 1.4877 - regression_loss: 1.2642 - classification_loss: 0.2236 324/500 [==================>...........] - ETA: 58s - loss: 1.4866 - regression_loss: 1.2632 - classification_loss: 0.2234 325/500 [==================>...........] - ETA: 58s - loss: 1.4888 - regression_loss: 1.2650 - classification_loss: 0.2237 326/500 [==================>...........] - ETA: 57s - loss: 1.4889 - regression_loss: 1.2652 - classification_loss: 0.2236 327/500 [==================>...........] - ETA: 57s - loss: 1.4886 - regression_loss: 1.2651 - classification_loss: 0.2236 328/500 [==================>...........] - ETA: 57s - loss: 1.4891 - regression_loss: 1.2655 - classification_loss: 0.2236 329/500 [==================>...........] 
- ETA: 56s - loss: 1.4886 - regression_loss: 1.2651 - classification_loss: 0.2235 330/500 [==================>...........] - ETA: 56s - loss: 1.4901 - regression_loss: 1.2663 - classification_loss: 0.2238 331/500 [==================>...........] - ETA: 56s - loss: 1.4914 - regression_loss: 1.2673 - classification_loss: 0.2241 332/500 [==================>...........] - ETA: 55s - loss: 1.4925 - regression_loss: 1.2683 - classification_loss: 0.2242 333/500 [==================>...........] - ETA: 55s - loss: 1.4935 - regression_loss: 1.2692 - classification_loss: 0.2243 334/500 [===================>..........] - ETA: 55s - loss: 1.4912 - regression_loss: 1.2673 - classification_loss: 0.2239 335/500 [===================>..........] - ETA: 54s - loss: 1.4889 - regression_loss: 1.2653 - classification_loss: 0.2237 336/500 [===================>..........] - ETA: 54s - loss: 1.4906 - regression_loss: 1.2668 - classification_loss: 0.2238 337/500 [===================>..........] - ETA: 54s - loss: 1.4910 - regression_loss: 1.2674 - classification_loss: 0.2237 338/500 [===================>..........] - ETA: 53s - loss: 1.4912 - regression_loss: 1.2676 - classification_loss: 0.2236 339/500 [===================>..........] - ETA: 53s - loss: 1.4917 - regression_loss: 1.2680 - classification_loss: 0.2237 340/500 [===================>..........] - ETA: 53s - loss: 1.4910 - regression_loss: 1.2673 - classification_loss: 0.2237 341/500 [===================>..........] - ETA: 52s - loss: 1.4908 - regression_loss: 1.2672 - classification_loss: 0.2236 342/500 [===================>..........] - ETA: 52s - loss: 1.4894 - regression_loss: 1.2661 - classification_loss: 0.2233 343/500 [===================>..........] - ETA: 52s - loss: 1.4905 - regression_loss: 1.2669 - classification_loss: 0.2235 344/500 [===================>..........] - ETA: 51s - loss: 1.4916 - regression_loss: 1.2678 - classification_loss: 0.2238 345/500 [===================>..........] 
- ETA: 51s - loss: 1.4918 - regression_loss: 1.2680 - classification_loss: 0.2239 346/500 [===================>..........] - ETA: 51s - loss: 1.4921 - regression_loss: 1.2682 - classification_loss: 0.2239 347/500 [===================>..........] - ETA: 50s - loss: 1.4911 - regression_loss: 1.2675 - classification_loss: 0.2237 348/500 [===================>..........] - ETA: 50s - loss: 1.4886 - regression_loss: 1.2653 - classification_loss: 0.2233 349/500 [===================>..........] - ETA: 50s - loss: 1.4857 - regression_loss: 1.2628 - classification_loss: 0.2229 350/500 [====================>.........] - ETA: 49s - loss: 1.4866 - regression_loss: 1.2636 - classification_loss: 0.2231 351/500 [====================>.........] - ETA: 49s - loss: 1.4883 - regression_loss: 1.2651 - classification_loss: 0.2232 352/500 [====================>.........] - ETA: 49s - loss: 1.4893 - regression_loss: 1.2658 - classification_loss: 0.2234 353/500 [====================>.........] - ETA: 48s - loss: 1.4897 - regression_loss: 1.2663 - classification_loss: 0.2234 354/500 [====================>.........] - ETA: 48s - loss: 1.4905 - regression_loss: 1.2671 - classification_loss: 0.2234 355/500 [====================>.........] - ETA: 48s - loss: 1.4886 - regression_loss: 1.2654 - classification_loss: 0.2232 356/500 [====================>.........] - ETA: 47s - loss: 1.4874 - regression_loss: 1.2643 - classification_loss: 0.2230 357/500 [====================>.........] - ETA: 47s - loss: 1.4859 - regression_loss: 1.2632 - classification_loss: 0.2227 358/500 [====================>.........] - ETA: 47s - loss: 1.4859 - regression_loss: 1.2632 - classification_loss: 0.2227 359/500 [====================>.........] - ETA: 46s - loss: 1.4857 - regression_loss: 1.2632 - classification_loss: 0.2225 360/500 [====================>.........] - ETA: 46s - loss: 1.4831 - regression_loss: 1.2609 - classification_loss: 0.2221 361/500 [====================>.........] 
- ETA: 46s - loss: 1.4819 - regression_loss: 1.2600 - classification_loss: 0.2218 362/500 [====================>.........] - ETA: 45s - loss: 1.4806 - regression_loss: 1.2591 - classification_loss: 0.2216 363/500 [====================>.........] - ETA: 45s - loss: 1.4830 - regression_loss: 1.2605 - classification_loss: 0.2225 364/500 [====================>.........] - ETA: 45s - loss: 1.4816 - regression_loss: 1.2593 - classification_loss: 0.2223 365/500 [====================>.........] - ETA: 44s - loss: 1.4846 - regression_loss: 1.2614 - classification_loss: 0.2233 366/500 [====================>.........] - ETA: 44s - loss: 1.4831 - regression_loss: 1.2601 - classification_loss: 0.2230 367/500 [=====================>........] - ETA: 44s - loss: 1.4841 - regression_loss: 1.2610 - classification_loss: 0.2231 368/500 [=====================>........] - ETA: 43s - loss: 1.4844 - regression_loss: 1.2612 - classification_loss: 0.2232 369/500 [=====================>........] - ETA: 43s - loss: 1.4858 - regression_loss: 1.2626 - classification_loss: 0.2233 370/500 [=====================>........] - ETA: 43s - loss: 1.4876 - regression_loss: 1.2641 - classification_loss: 0.2235 371/500 [=====================>........] - ETA: 42s - loss: 1.4894 - regression_loss: 1.2656 - classification_loss: 0.2238 372/500 [=====================>........] - ETA: 42s - loss: 1.4884 - regression_loss: 1.2644 - classification_loss: 0.2240 373/500 [=====================>........] - ETA: 42s - loss: 1.4892 - regression_loss: 1.2651 - classification_loss: 0.2241 374/500 [=====================>........] - ETA: 41s - loss: 1.4902 - regression_loss: 1.2660 - classification_loss: 0.2242 375/500 [=====================>........] - ETA: 41s - loss: 1.4916 - regression_loss: 1.2670 - classification_loss: 0.2246 376/500 [=====================>........] - ETA: 41s - loss: 1.4916 - regression_loss: 1.2662 - classification_loss: 0.2254 377/500 [=====================>........] 
- ETA: 40s - loss: 1.4913 - regression_loss: 1.2660 - classification_loss: 0.2252 378/500 [=====================>........] - ETA: 40s - loss: 1.4911 - regression_loss: 1.2659 - classification_loss: 0.2252 379/500 [=====================>........] - ETA: 40s - loss: 1.4910 - regression_loss: 1.2658 - classification_loss: 0.2252 380/500 [=====================>........] - ETA: 39s - loss: 1.4891 - regression_loss: 1.2643 - classification_loss: 0.2248 381/500 [=====================>........] - ETA: 39s - loss: 1.4897 - regression_loss: 1.2648 - classification_loss: 0.2249 382/500 [=====================>........] - ETA: 39s - loss: 1.4904 - regression_loss: 1.2655 - classification_loss: 0.2249 383/500 [=====================>........] - ETA: 38s - loss: 1.4912 - regression_loss: 1.2661 - classification_loss: 0.2251 384/500 [======================>.......] - ETA: 38s - loss: 1.4902 - regression_loss: 1.2651 - classification_loss: 0.2251 385/500 [======================>.......] - ETA: 38s - loss: 1.4903 - regression_loss: 1.2649 - classification_loss: 0.2253 386/500 [======================>.......] - ETA: 37s - loss: 1.4901 - regression_loss: 1.2647 - classification_loss: 0.2253 387/500 [======================>.......] - ETA: 37s - loss: 1.4886 - regression_loss: 1.2634 - classification_loss: 0.2252 388/500 [======================>.......] - ETA: 37s - loss: 1.4896 - regression_loss: 1.2642 - classification_loss: 0.2253 389/500 [======================>.......] - ETA: 36s - loss: 1.4901 - regression_loss: 1.2647 - classification_loss: 0.2253 390/500 [======================>.......] - ETA: 36s - loss: 1.4907 - regression_loss: 1.2653 - classification_loss: 0.2254 391/500 [======================>.......] - ETA: 36s - loss: 1.4912 - regression_loss: 1.2656 - classification_loss: 0.2256 392/500 [======================>.......] - ETA: 35s - loss: 1.4910 - regression_loss: 1.2655 - classification_loss: 0.2255 393/500 [======================>.......] 
- ETA: 35s - loss: 1.4931 - regression_loss: 1.2670 - classification_loss: 0.2260 394/500 [======================>.......] - ETA: 35s - loss: 1.4933 - regression_loss: 1.2674 - classification_loss: 0.2259 395/500 [======================>.......] - ETA: 34s - loss: 1.4923 - regression_loss: 1.2667 - classification_loss: 0.2256 396/500 [======================>.......] - ETA: 34s - loss: 1.4924 - regression_loss: 1.2667 - classification_loss: 0.2257 397/500 [======================>.......] - ETA: 34s - loss: 1.4934 - regression_loss: 1.2674 - classification_loss: 0.2260 398/500 [======================>.......] - ETA: 33s - loss: 1.4934 - regression_loss: 1.2674 - classification_loss: 0.2260 399/500 [======================>.......] - ETA: 33s - loss: 1.4931 - regression_loss: 1.2672 - classification_loss: 0.2259 400/500 [=======================>......] - ETA: 33s - loss: 1.4935 - regression_loss: 1.2678 - classification_loss: 0.2258 401/500 [=======================>......] - ETA: 32s - loss: 1.4927 - regression_loss: 1.2670 - classification_loss: 0.2257 402/500 [=======================>......] - ETA: 32s - loss: 1.4930 - regression_loss: 1.2673 - classification_loss: 0.2257 403/500 [=======================>......] - ETA: 32s - loss: 1.4912 - regression_loss: 1.2658 - classification_loss: 0.2255 404/500 [=======================>......] - ETA: 31s - loss: 1.4916 - regression_loss: 1.2662 - classification_loss: 0.2254 405/500 [=======================>......] - ETA: 31s - loss: 1.4906 - regression_loss: 1.2653 - classification_loss: 0.2253 406/500 [=======================>......] - ETA: 31s - loss: 1.4910 - regression_loss: 1.2657 - classification_loss: 0.2254 407/500 [=======================>......] - ETA: 30s - loss: 1.4918 - regression_loss: 1.2662 - classification_loss: 0.2256 408/500 [=======================>......] - ETA: 30s - loss: 1.4940 - regression_loss: 1.2679 - classification_loss: 0.2261 409/500 [=======================>......] 
- ETA: 30s - loss: 1.4933 - regression_loss: 1.2673 - classification_loss: 0.2261 410/500 [=======================>......] - ETA: 29s - loss: 1.4926 - regression_loss: 1.2667 - classification_loss: 0.2259 411/500 [=======================>......] - ETA: 29s - loss: 1.4917 - regression_loss: 1.2660 - classification_loss: 0.2257 412/500 [=======================>......] - ETA: 29s - loss: 1.4917 - regression_loss: 1.2660 - classification_loss: 0.2258 413/500 [=======================>......] - ETA: 28s - loss: 1.4927 - regression_loss: 1.2668 - classification_loss: 0.2259 414/500 [=======================>......] - ETA: 28s - loss: 1.4909 - regression_loss: 1.2652 - classification_loss: 0.2256 415/500 [=======================>......] - ETA: 28s - loss: 1.4887 - regression_loss: 1.2634 - classification_loss: 0.2253 416/500 [=======================>......] - ETA: 27s - loss: 1.4868 - regression_loss: 1.2617 - classification_loss: 0.2251 417/500 [========================>.....] - ETA: 27s - loss: 1.4859 - regression_loss: 1.2611 - classification_loss: 0.2248 418/500 [========================>.....] - ETA: 27s - loss: 1.4872 - regression_loss: 1.2624 - classification_loss: 0.2249 419/500 [========================>.....] - ETA: 26s - loss: 1.4878 - regression_loss: 1.2628 - classification_loss: 0.2250 420/500 [========================>.....] - ETA: 26s - loss: 1.4861 - regression_loss: 1.2614 - classification_loss: 0.2247 421/500 [========================>.....] - ETA: 26s - loss: 1.4866 - regression_loss: 1.2619 - classification_loss: 0.2248 422/500 [========================>.....] - ETA: 25s - loss: 1.4879 - regression_loss: 1.2625 - classification_loss: 0.2253 423/500 [========================>.....] - ETA: 25s - loss: 1.4887 - regression_loss: 1.2633 - classification_loss: 0.2254 424/500 [========================>.....] - ETA: 25s - loss: 1.4875 - regression_loss: 1.2624 - classification_loss: 0.2252 425/500 [========================>.....] 
- ETA: 24s - loss: 1.4891 - regression_loss: 1.2638 - classification_loss: 0.2253 426/500 [========================>.....] - ETA: 24s - loss: 1.4899 - regression_loss: 1.2644 - classification_loss: 0.2255 427/500 [========================>.....] - ETA: 24s - loss: 1.4909 - regression_loss: 1.2654 - classification_loss: 0.2256 428/500 [========================>.....] - ETA: 23s - loss: 1.4900 - regression_loss: 1.2645 - classification_loss: 0.2255 429/500 [========================>.....] - ETA: 23s - loss: 1.4893 - regression_loss: 1.2639 - classification_loss: 0.2254 430/500 [========================>.....] - ETA: 23s - loss: 1.4890 - regression_loss: 1.2636 - classification_loss: 0.2255 431/500 [========================>.....] - ETA: 22s - loss: 1.4894 - regression_loss: 1.2639 - classification_loss: 0.2255 432/500 [========================>.....] - ETA: 22s - loss: 1.4906 - regression_loss: 1.2650 - classification_loss: 0.2256 433/500 [========================>.....] - ETA: 22s - loss: 1.4905 - regression_loss: 1.2649 - classification_loss: 0.2256 434/500 [=========================>....] - ETA: 21s - loss: 1.4891 - regression_loss: 1.2637 - classification_loss: 0.2254 435/500 [=========================>....] - ETA: 21s - loss: 1.4895 - regression_loss: 1.2642 - classification_loss: 0.2253 436/500 [=========================>....] - ETA: 21s - loss: 1.4901 - regression_loss: 1.2648 - classification_loss: 0.2253 437/500 [=========================>....] - ETA: 20s - loss: 1.4907 - regression_loss: 1.2652 - classification_loss: 0.2254 438/500 [=========================>....] - ETA: 20s - loss: 1.4898 - regression_loss: 1.2645 - classification_loss: 0.2253 439/500 [=========================>....] - ETA: 20s - loss: 1.4881 - regression_loss: 1.2631 - classification_loss: 0.2251 440/500 [=========================>....] - ETA: 19s - loss: 1.4889 - regression_loss: 1.2638 - classification_loss: 0.2251 441/500 [=========================>....] 
- ETA: 19s - loss: 1.4887 - regression_loss: 1.2637 - classification_loss: 0.2250 442/500 [=========================>....] - ETA: 19s - loss: 1.4897 - regression_loss: 1.2645 - classification_loss: 0.2251 443/500 [=========================>....] - ETA: 18s - loss: 1.4901 - regression_loss: 1.2649 - classification_loss: 0.2252 444/500 [=========================>....] - ETA: 18s - loss: 1.4898 - regression_loss: 1.2644 - classification_loss: 0.2254 445/500 [=========================>....] - ETA: 18s - loss: 1.4897 - regression_loss: 1.2643 - classification_loss: 0.2254 446/500 [=========================>....] - ETA: 17s - loss: 1.4891 - regression_loss: 1.2639 - classification_loss: 0.2253 447/500 [=========================>....] - ETA: 17s - loss: 1.4885 - regression_loss: 1.2634 - classification_loss: 0.2252 448/500 [=========================>....] - ETA: 17s - loss: 1.4893 - regression_loss: 1.2640 - classification_loss: 0.2253 449/500 [=========================>....] - ETA: 16s - loss: 1.4881 - regression_loss: 1.2629 - classification_loss: 0.2252 450/500 [==========================>...] - ETA: 16s - loss: 1.4892 - regression_loss: 1.2639 - classification_loss: 0.2253 451/500 [==========================>...] - ETA: 16s - loss: 1.4901 - regression_loss: 1.2646 - classification_loss: 0.2255 452/500 [==========================>...] - ETA: 15s - loss: 1.4890 - regression_loss: 1.2636 - classification_loss: 0.2254 453/500 [==========================>...] - ETA: 15s - loss: 1.4900 - regression_loss: 1.2644 - classification_loss: 0.2256 454/500 [==========================>...] - ETA: 15s - loss: 1.4888 - regression_loss: 1.2633 - classification_loss: 0.2255 455/500 [==========================>...] - ETA: 14s - loss: 1.4885 - regression_loss: 1.2631 - classification_loss: 0.2254 456/500 [==========================>...] - ETA: 14s - loss: 1.4887 - regression_loss: 1.2632 - classification_loss: 0.2255 457/500 [==========================>...] 
- ETA: 14s - loss: 1.4890 - regression_loss: 1.2635 - classification_loss: 0.2255 458/500 [==========================>...] - ETA: 13s - loss: 1.4885 - regression_loss: 1.2632 - classification_loss: 0.2253 459/500 [==========================>...] - ETA: 13s - loss: 1.4890 - regression_loss: 1.2636 - classification_loss: 0.2254 460/500 [==========================>...] - ETA: 13s - loss: 1.4894 - regression_loss: 1.2640 - classification_loss: 0.2254 461/500 [==========================>...] - ETA: 12s - loss: 1.4898 - regression_loss: 1.2643 - classification_loss: 0.2255 462/500 [==========================>...] - ETA: 12s - loss: 1.4908 - regression_loss: 1.2653 - classification_loss: 0.2256 463/500 [==========================>...] - ETA: 12s - loss: 1.4915 - regression_loss: 1.2658 - classification_loss: 0.2258 464/500 [==========================>...] - ETA: 11s - loss: 1.4927 - regression_loss: 1.2667 - classification_loss: 0.2260 465/500 [==========================>...] - ETA: 11s - loss: 1.4927 - regression_loss: 1.2667 - classification_loss: 0.2260 466/500 [==========================>...] - ETA: 11s - loss: 1.4913 - regression_loss: 1.2654 - classification_loss: 0.2259 467/500 [===========================>..] - ETA: 10s - loss: 1.4909 - regression_loss: 1.2651 - classification_loss: 0.2258 468/500 [===========================>..] - ETA: 10s - loss: 1.4915 - regression_loss: 1.2656 - classification_loss: 0.2259 469/500 [===========================>..] - ETA: 10s - loss: 1.4902 - regression_loss: 1.2644 - classification_loss: 0.2257 470/500 [===========================>..] - ETA: 9s - loss: 1.4906 - regression_loss: 1.2648 - classification_loss: 0.2258  471/500 [===========================>..] - ETA: 9s - loss: 1.4907 - regression_loss: 1.2649 - classification_loss: 0.2258 472/500 [===========================>..] - ETA: 9s - loss: 1.4886 - regression_loss: 1.2631 - classification_loss: 0.2255 473/500 [===========================>..] 
- ETA: 8s - loss: 1.4880 - regression_loss: 1.2626 - classification_loss: 0.2253 474/500 [===========================>..] - ETA: 8s - loss: 1.4866 - regression_loss: 1.2616 - classification_loss: 0.2250 475/500 [===========================>..] - ETA: 8s - loss: 1.4847 - regression_loss: 1.2600 - classification_loss: 0.2248 476/500 [===========================>..] - ETA: 7s - loss: 1.4857 - regression_loss: 1.2607 - classification_loss: 0.2250 477/500 [===========================>..] - ETA: 7s - loss: 1.4868 - regression_loss: 1.2615 - classification_loss: 0.2253 478/500 [===========================>..] - ETA: 7s - loss: 1.4884 - regression_loss: 1.2627 - classification_loss: 0.2257 479/500 [===========================>..] - ETA: 6s - loss: 1.4896 - regression_loss: 1.2637 - classification_loss: 0.2259 480/500 [===========================>..] - ETA: 6s - loss: 1.4900 - regression_loss: 1.2640 - classification_loss: 0.2260 481/500 [===========================>..] - ETA: 6s - loss: 1.4906 - regression_loss: 1.2647 - classification_loss: 0.2260 482/500 [===========================>..] - ETA: 5s - loss: 1.4909 - regression_loss: 1.2649 - classification_loss: 0.2260 483/500 [===========================>..] - ETA: 5s - loss: 1.4912 - regression_loss: 1.2652 - classification_loss: 0.2260 484/500 [============================>.] - ETA: 5s - loss: 1.4917 - regression_loss: 1.2656 - classification_loss: 0.2261 485/500 [============================>.] - ETA: 4s - loss: 1.4919 - regression_loss: 1.2658 - classification_loss: 0.2261 486/500 [============================>.] - ETA: 4s - loss: 1.4924 - regression_loss: 1.2663 - classification_loss: 0.2261 487/500 [============================>.] - ETA: 4s - loss: 1.4903 - regression_loss: 1.2645 - classification_loss: 0.2258 488/500 [============================>.] - ETA: 3s - loss: 1.4900 - regression_loss: 1.2642 - classification_loss: 0.2258 489/500 [============================>.] 
- ETA: 3s - loss: 1.4885 - regression_loss: 1.2629 - classification_loss: 0.2256 490/500 [============================>.] - ETA: 3s - loss: 1.4898 - regression_loss: 1.2639 - classification_loss: 0.2259 491/500 [============================>.] - ETA: 2s - loss: 1.4907 - regression_loss: 1.2646 - classification_loss: 0.2261 492/500 [============================>.] - ETA: 2s - loss: 1.4908 - regression_loss: 1.2646 - classification_loss: 0.2262 493/500 [============================>.] - ETA: 2s - loss: 1.4912 - regression_loss: 1.2647 - classification_loss: 0.2265 494/500 [============================>.] - ETA: 1s - loss: 1.4931 - regression_loss: 1.2663 - classification_loss: 0.2269 495/500 [============================>.] - ETA: 1s - loss: 1.4920 - regression_loss: 1.2653 - classification_loss: 0.2267 496/500 [============================>.] - ETA: 1s - loss: 1.4907 - regression_loss: 1.2640 - classification_loss: 0.2266 497/500 [============================>.] - ETA: 0s - loss: 1.4897 - regression_loss: 1.2632 - classification_loss: 0.2264 498/500 [============================>.] - ETA: 0s - loss: 1.4907 - regression_loss: 1.2640 - classification_loss: 0.2267 499/500 [============================>.] - ETA: 0s - loss: 1.4912 - regression_loss: 1.2644 - classification_loss: 0.2268 500/500 [==============================] - 166s 331ms/step - loss: 1.4917 - regression_loss: 1.2649 - classification_loss: 0.2268
1172 instances of class plum with average precision: 0.5714
mAP: 0.5714
Epoch 00017: saving model to ./training/snapshots/resnet101_pascal_17.h5
Epoch 18/150
1/500 [..............................] - ETA: 2:47 - loss: 1.7939 - regression_loss: 1.5110 - classification_loss: 0.2829 2/500 [..............................] - ETA: 2:53 - loss: 1.7048 - regression_loss: 1.4554 - classification_loss: 0.2494 3/500 [..............................] - ETA: 2:50 - loss: 1.8310 - regression_loss: 1.5702 - classification_loss: 0.2608 4/500 [..............................] 
- ETA: 2:52 - loss: 1.9362 - regression_loss: 1.6554 - classification_loss: 0.2808 5/500 [..............................] - ETA: 2:52 - loss: 1.7756 - regression_loss: 1.4979 - classification_loss: 0.2777 6/500 [..............................] - ETA: 2:52 - loss: 1.7044 - regression_loss: 1.4426 - classification_loss: 0.2618 7/500 [..............................] - ETA: 2:52 - loss: 1.6217 - regression_loss: 1.3778 - classification_loss: 0.2438 8/500 [..............................] - ETA: 2:50 - loss: 1.6322 - regression_loss: 1.3928 - classification_loss: 0.2394 9/500 [..............................] - ETA: 2:48 - loss: 1.6502 - regression_loss: 1.4061 - classification_loss: 0.2441 10/500 [..............................] - ETA: 2:47 - loss: 1.6237 - regression_loss: 1.3814 - classification_loss: 0.2424 11/500 [..............................] - ETA: 2:46 - loss: 1.6104 - regression_loss: 1.3715 - classification_loss: 0.2389 12/500 [..............................] - ETA: 2:46 - loss: 1.5512 - regression_loss: 1.3228 - classification_loss: 0.2284 13/500 [..............................] - ETA: 2:45 - loss: 1.5200 - regression_loss: 1.2973 - classification_loss: 0.2227 14/500 [..............................] - ETA: 2:44 - loss: 1.5202 - regression_loss: 1.3041 - classification_loss: 0.2161 15/500 [..............................] - ETA: 2:44 - loss: 1.4818 - regression_loss: 1.2691 - classification_loss: 0.2126 16/500 [..............................] - ETA: 2:43 - loss: 1.5030 - regression_loss: 1.2879 - classification_loss: 0.2150 17/500 [>.............................] - ETA: 2:42 - loss: 1.5107 - regression_loss: 1.2949 - classification_loss: 0.2159 18/500 [>.............................] - ETA: 2:42 - loss: 1.5212 - regression_loss: 1.3016 - classification_loss: 0.2196 19/500 [>.............................] - ETA: 2:42 - loss: 1.5415 - regression_loss: 1.3196 - classification_loss: 0.2219 20/500 [>.............................] 
- ETA: 2:41 - loss: 1.5554 - regression_loss: 1.3329 - classification_loss: 0.2226 21/500 [>.............................] - ETA: 2:40 - loss: 1.5287 - regression_loss: 1.3100 - classification_loss: 0.2186 22/500 [>.............................] - ETA: 2:40 - loss: 1.5507 - regression_loss: 1.3271 - classification_loss: 0.2236 23/500 [>.............................] - ETA: 2:39 - loss: 1.5620 - regression_loss: 1.3378 - classification_loss: 0.2242 24/500 [>.............................] - ETA: 2:39 - loss: 1.5522 - regression_loss: 1.3283 - classification_loss: 0.2239 25/500 [>.............................] - ETA: 2:38 - loss: 1.5439 - regression_loss: 1.3207 - classification_loss: 0.2232 26/500 [>.............................] - ETA: 2:38 - loss: 1.5198 - regression_loss: 1.2994 - classification_loss: 0.2204 27/500 [>.............................] - ETA: 2:37 - loss: 1.5074 - regression_loss: 1.2885 - classification_loss: 0.2189 28/500 [>.............................] - ETA: 2:37 - loss: 1.4863 - regression_loss: 1.2706 - classification_loss: 0.2157 29/500 [>.............................] - ETA: 2:37 - loss: 1.4742 - regression_loss: 1.2591 - classification_loss: 0.2152 30/500 [>.............................] - ETA: 2:37 - loss: 1.4800 - regression_loss: 1.2680 - classification_loss: 0.2120 31/500 [>.............................] - ETA: 2:36 - loss: 1.4829 - regression_loss: 1.2713 - classification_loss: 0.2116 32/500 [>.............................] - ETA: 2:36 - loss: 1.4582 - regression_loss: 1.2493 - classification_loss: 0.2088 33/500 [>.............................] - ETA: 2:36 - loss: 1.4644 - regression_loss: 1.2557 - classification_loss: 0.2087 34/500 [=>............................] - ETA: 2:35 - loss: 1.4679 - regression_loss: 1.2585 - classification_loss: 0.2094 35/500 [=>............................] - ETA: 2:34 - loss: 1.4489 - regression_loss: 1.2430 - classification_loss: 0.2059 36/500 [=>............................] 
- ETA: 2:34 - loss: 1.4585 - regression_loss: 1.2511 - classification_loss: 0.2074 37/500 [=>............................] - ETA: 2:34 - loss: 1.4312 - regression_loss: 1.2268 - classification_loss: 0.2043 38/500 [=>............................] - ETA: 2:33 - loss: 1.4267 - regression_loss: 1.2228 - classification_loss: 0.2038 39/500 [=>............................] - ETA: 2:33 - loss: 1.4318 - regression_loss: 1.2260 - classification_loss: 0.2058 40/500 [=>............................] - ETA: 2:33 - loss: 1.4345 - regression_loss: 1.2284 - classification_loss: 0.2061 41/500 [=>............................] - ETA: 2:33 - loss: 1.4094 - regression_loss: 1.2058 - classification_loss: 0.2036 42/500 [=>............................] - ETA: 2:32 - loss: 1.3959 - regression_loss: 1.1940 - classification_loss: 0.2019 43/500 [=>............................] - ETA: 2:32 - loss: 1.3992 - regression_loss: 1.1974 - classification_loss: 0.2018 44/500 [=>............................] - ETA: 2:32 - loss: 1.3962 - regression_loss: 1.1942 - classification_loss: 0.2020 45/500 [=>............................] - ETA: 2:31 - loss: 1.4026 - regression_loss: 1.1999 - classification_loss: 0.2026 46/500 [=>............................] - ETA: 2:31 - loss: 1.3981 - regression_loss: 1.1955 - classification_loss: 0.2026 47/500 [=>............................] - ETA: 2:31 - loss: 1.3855 - regression_loss: 1.1848 - classification_loss: 0.2007 48/500 [=>............................] - ETA: 2:30 - loss: 1.3723 - regression_loss: 1.1731 - classification_loss: 0.1992 49/500 [=>............................] - ETA: 2:30 - loss: 1.3732 - regression_loss: 1.1735 - classification_loss: 0.1997 50/500 [==>...........................] - ETA: 2:30 - loss: 1.3891 - regression_loss: 1.1859 - classification_loss: 0.2031 51/500 [==>...........................] - ETA: 2:29 - loss: 1.3891 - regression_loss: 1.1861 - classification_loss: 0.2030 52/500 [==>...........................] 
- ETA: 2:29 - loss: 1.3923 - regression_loss: 1.1888 - classification_loss: 0.2035 53/500 [==>...........................] - ETA: 2:29 - loss: 1.4008 - regression_loss: 1.1963 - classification_loss: 0.2045 54/500 [==>...........................] - ETA: 2:28 - loss: 1.3947 - regression_loss: 1.1913 - classification_loss: 0.2034 55/500 [==>...........................] - ETA: 2:28 - loss: 1.3847 - regression_loss: 1.1832 - classification_loss: 0.2015 56/500 [==>...........................] - ETA: 2:27 - loss: 1.3927 - regression_loss: 1.1907 - classification_loss: 0.2020 57/500 [==>...........................] - ETA: 2:27 - loss: 1.3932 - regression_loss: 1.1924 - classification_loss: 0.2008 58/500 [==>...........................] - ETA: 2:27 - loss: 1.3867 - regression_loss: 1.1867 - classification_loss: 0.2000 59/500 [==>...........................] - ETA: 2:27 - loss: 1.3946 - regression_loss: 1.1933 - classification_loss: 0.2014 60/500 [==>...........................] - ETA: 2:26 - loss: 1.4015 - regression_loss: 1.1982 - classification_loss: 0.2033 61/500 [==>...........................] - ETA: 2:26 - loss: 1.4103 - regression_loss: 1.2055 - classification_loss: 0.2048 62/500 [==>...........................] - ETA: 2:26 - loss: 1.4108 - regression_loss: 1.2057 - classification_loss: 0.2050 63/500 [==>...........................] - ETA: 2:25 - loss: 1.4105 - regression_loss: 1.2065 - classification_loss: 0.2040 64/500 [==>...........................] - ETA: 2:25 - loss: 1.4126 - regression_loss: 1.2081 - classification_loss: 0.2044 65/500 [==>...........................] - ETA: 2:24 - loss: 1.4132 - regression_loss: 1.2087 - classification_loss: 0.2045 66/500 [==>...........................] - ETA: 2:24 - loss: 1.4219 - regression_loss: 1.2153 - classification_loss: 0.2065 67/500 [===>..........................] - ETA: 2:24 - loss: 1.4318 - regression_loss: 1.2237 - classification_loss: 0.2080 68/500 [===>..........................] 
- ETA: 2:24 - loss: 1.4302 - regression_loss: 1.2235 - classification_loss: 0.2067 69/500 [===>..........................] - ETA: 2:23 - loss: 1.4269 - regression_loss: 1.2201 - classification_loss: 0.2069 70/500 [===>..........................] - ETA: 2:23 - loss: 1.4329 - regression_loss: 1.2253 - classification_loss: 0.2076 71/500 [===>..........................] - ETA: 2:23 - loss: 1.4372 - regression_loss: 1.2305 - classification_loss: 0.2067 72/500 [===>..........................] - ETA: 2:22 - loss: 1.4306 - regression_loss: 1.2249 - classification_loss: 0.2058 73/500 [===>..........................] - ETA: 2:22 - loss: 1.4245 - regression_loss: 1.2187 - classification_loss: 0.2058 74/500 [===>..........................] - ETA: 2:22 - loss: 1.4254 - regression_loss: 1.2191 - classification_loss: 0.2063 75/500 [===>..........................] - ETA: 2:21 - loss: 1.4253 - regression_loss: 1.2188 - classification_loss: 0.2065 76/500 [===>..........................] - ETA: 2:21 - loss: 1.4200 - regression_loss: 1.2147 - classification_loss: 0.2052 77/500 [===>..........................] - ETA: 2:20 - loss: 1.4290 - regression_loss: 1.2216 - classification_loss: 0.2074 78/500 [===>..........................] - ETA: 2:20 - loss: 1.4299 - regression_loss: 1.2229 - classification_loss: 0.2071 79/500 [===>..........................] - ETA: 2:20 - loss: 1.4376 - regression_loss: 1.2295 - classification_loss: 0.2081 80/500 [===>..........................] - ETA: 2:19 - loss: 1.4393 - regression_loss: 1.2312 - classification_loss: 0.2081 81/500 [===>..........................] - ETA: 2:19 - loss: 1.4451 - regression_loss: 1.2361 - classification_loss: 0.2090 82/500 [===>..........................] - ETA: 2:19 - loss: 1.4468 - regression_loss: 1.2380 - classification_loss: 0.2088 83/500 [===>..........................] - ETA: 2:18 - loss: 1.4499 - regression_loss: 1.2408 - classification_loss: 0.2091 84/500 [====>.........................] 
- ETA: 2:18 - loss: 1.4561 - regression_loss: 1.2463 - classification_loss: 0.2098 85/500 [====>.........................] - ETA: 2:18 - loss: 1.4608 - regression_loss: 1.2503 - classification_loss: 0.2106 86/500 [====>.........................] - ETA: 2:17 - loss: 1.4530 - regression_loss: 1.2437 - classification_loss: 0.2093 87/500 [====>.........................] - ETA: 2:17 - loss: 1.4529 - regression_loss: 1.2431 - classification_loss: 0.2098 88/500 [====>.........................] - ETA: 2:17 - loss: 1.4518 - regression_loss: 1.2418 - classification_loss: 0.2100 89/500 [====>.........................] - ETA: 2:16 - loss: 1.4483 - regression_loss: 1.2389 - classification_loss: 0.2093 90/500 [====>.........................] - ETA: 2:16 - loss: 1.4379 - regression_loss: 1.2291 - classification_loss: 0.2089 91/500 [====>.........................] - ETA: 2:16 - loss: 1.4416 - regression_loss: 1.2319 - classification_loss: 0.2097 92/500 [====>.........................] - ETA: 2:15 - loss: 1.4376 - regression_loss: 1.2276 - classification_loss: 0.2100 93/500 [====>.........................] - ETA: 2:15 - loss: 1.4374 - regression_loss: 1.2272 - classification_loss: 0.2101 94/500 [====>.........................] - ETA: 2:15 - loss: 1.4311 - regression_loss: 1.2218 - classification_loss: 0.2094 95/500 [====>.........................] - ETA: 2:14 - loss: 1.4319 - regression_loss: 1.2226 - classification_loss: 0.2093 96/500 [====>.........................] - ETA: 2:14 - loss: 1.4330 - regression_loss: 1.2233 - classification_loss: 0.2097 97/500 [====>.........................] - ETA: 2:14 - loss: 1.4354 - regression_loss: 1.2262 - classification_loss: 0.2092 98/500 [====>.........................] - ETA: 2:13 - loss: 1.4359 - regression_loss: 1.2263 - classification_loss: 0.2096 99/500 [====>.........................] - ETA: 2:13 - loss: 1.4342 - regression_loss: 1.2248 - classification_loss: 0.2094 100/500 [=====>........................] 
- ETA: 2:12 - loss: 1.4273 - regression_loss: 1.2182 - classification_loss: 0.2090 101/500 [=====>........................] - ETA: 2:12 - loss: 1.4260 - regression_loss: 1.2168 - classification_loss: 0.2093 102/500 [=====>........................] - ETA: 2:12 - loss: 1.4251 - regression_loss: 1.2155 - classification_loss: 0.2096 103/500 [=====>........................] - ETA: 2:11 - loss: 1.4245 - regression_loss: 1.2153 - classification_loss: 0.2092 104/500 [=====>........................] - ETA: 2:11 - loss: 1.4265 - regression_loss: 1.2174 - classification_loss: 0.2091 105/500 [=====>........................] - ETA: 2:11 - loss: 1.4298 - regression_loss: 1.2203 - classification_loss: 0.2095 106/500 [=====>........................] - ETA: 2:11 - loss: 1.4267 - regression_loss: 1.2174 - classification_loss: 0.2093 107/500 [=====>........................] - ETA: 2:10 - loss: 1.4201 - regression_loss: 1.2116 - classification_loss: 0.2085 108/500 [=====>........................] - ETA: 2:10 - loss: 1.4233 - regression_loss: 1.2143 - classification_loss: 0.2090 109/500 [=====>........................] - ETA: 2:10 - loss: 1.4239 - regression_loss: 1.2147 - classification_loss: 0.2092 110/500 [=====>........................] - ETA: 2:09 - loss: 1.4186 - regression_loss: 1.2098 - classification_loss: 0.2088 111/500 [=====>........................] - ETA: 2:09 - loss: 1.4195 - regression_loss: 1.2102 - classification_loss: 0.2092 112/500 [=====>........................] - ETA: 2:09 - loss: 1.4194 - regression_loss: 1.2101 - classification_loss: 0.2093 113/500 [=====>........................] - ETA: 2:08 - loss: 1.4219 - regression_loss: 1.2122 - classification_loss: 0.2097 114/500 [=====>........................] - ETA: 2:08 - loss: 1.4209 - regression_loss: 1.2113 - classification_loss: 0.2096 115/500 [=====>........................] - ETA: 2:08 - loss: 1.4232 - regression_loss: 1.2134 - classification_loss: 0.2099 116/500 [=====>........................] 
- ETA: 2:07 - loss: 1.4245 - regression_loss: 1.2144 - classification_loss: 0.2100
[Keras progress-bar updates for steps 117-451 of this epoch, flattened from carriage-return terminal output, condensed; total loss held steady around 1.41-1.43 (regression ~1.20-1.22, classification ~0.21). Representative samples:]
117/500 [======>.......................] - ETA: 2:07 - loss: 1.4241 - regression_loss: 1.2141 - classification_loss: 0.2101
150/500 [========>.....................] - ETA: 1:56 - loss: 1.4342 - regression_loss: 1.2214 - classification_loss: 0.2128
200/500 [===========>..................] - ETA: 1:39 - loss: 1.4250 - regression_loss: 1.2119 - classification_loss: 0.2131
250/500 [==============>...............] - ETA: 1:23 - loss: 1.4225 - regression_loss: 1.2127 - classification_loss: 0.2098
300/500 [=================>............] - ETA: 1:06 - loss: 1.4222 - regression_loss: 1.2123 - classification_loss: 0.2099
350/500 [====================>.........] - ETA: 49s - loss: 1.4213 - regression_loss: 1.2124 - classification_loss: 0.2089
400/500 [=======================>......] - ETA: 33s - loss: 1.4110 - regression_loss: 1.2041 - classification_loss: 0.2069
450/500 [==========================>...] - ETA: 16s - loss: 1.4096 - regression_loss: 1.2022 - classification_loss: 0.2074
452/500 [==========================>...]
- ETA: 15s - loss: 1.4121 - regression_loss: 1.2044 - classification_loss: 0.2077 453/500 [==========================>...] - ETA: 15s - loss: 1.4121 - regression_loss: 1.2044 - classification_loss: 0.2077 454/500 [==========================>...] - ETA: 15s - loss: 1.4114 - regression_loss: 1.2037 - classification_loss: 0.2077 455/500 [==========================>...] - ETA: 14s - loss: 1.4116 - regression_loss: 1.2039 - classification_loss: 0.2077 456/500 [==========================>...] - ETA: 14s - loss: 1.4103 - regression_loss: 1.2028 - classification_loss: 0.2076 457/500 [==========================>...] - ETA: 14s - loss: 1.4106 - regression_loss: 1.2031 - classification_loss: 0.2075 458/500 [==========================>...] - ETA: 13s - loss: 1.4115 - regression_loss: 1.2039 - classification_loss: 0.2077 459/500 [==========================>...] - ETA: 13s - loss: 1.4107 - regression_loss: 1.2032 - classification_loss: 0.2075 460/500 [==========================>...] - ETA: 13s - loss: 1.4116 - regression_loss: 1.2040 - classification_loss: 0.2076 461/500 [==========================>...] - ETA: 12s - loss: 1.4129 - regression_loss: 1.2051 - classification_loss: 0.2078 462/500 [==========================>...] - ETA: 12s - loss: 1.4138 - regression_loss: 1.2058 - classification_loss: 0.2080 463/500 [==========================>...] - ETA: 12s - loss: 1.4138 - regression_loss: 1.2058 - classification_loss: 0.2080 464/500 [==========================>...] - ETA: 11s - loss: 1.4142 - regression_loss: 1.2062 - classification_loss: 0.2081 465/500 [==========================>...] - ETA: 11s - loss: 1.4134 - regression_loss: 1.2055 - classification_loss: 0.2079 466/500 [==========================>...] - ETA: 11s - loss: 1.4138 - regression_loss: 1.2058 - classification_loss: 0.2080 467/500 [===========================>..] - ETA: 10s - loss: 1.4136 - regression_loss: 1.2056 - classification_loss: 0.2080 468/500 [===========================>..] 
- ETA: 10s - loss: 1.4141 - regression_loss: 1.2061 - classification_loss: 0.2080 469/500 [===========================>..] - ETA: 10s - loss: 1.4143 - regression_loss: 1.2062 - classification_loss: 0.2080 470/500 [===========================>..] - ETA: 9s - loss: 1.4142 - regression_loss: 1.2061 - classification_loss: 0.2081  471/500 [===========================>..] - ETA: 9s - loss: 1.4128 - regression_loss: 1.2050 - classification_loss: 0.2079 472/500 [===========================>..] - ETA: 9s - loss: 1.4126 - regression_loss: 1.2049 - classification_loss: 0.2077 473/500 [===========================>..] - ETA: 8s - loss: 1.4143 - regression_loss: 1.2064 - classification_loss: 0.2079 474/500 [===========================>..] - ETA: 8s - loss: 1.4148 - regression_loss: 1.2068 - classification_loss: 0.2081 475/500 [===========================>..] - ETA: 8s - loss: 1.4156 - regression_loss: 1.2074 - classification_loss: 0.2081 476/500 [===========================>..] - ETA: 7s - loss: 1.4168 - regression_loss: 1.2084 - classification_loss: 0.2084 477/500 [===========================>..] - ETA: 7s - loss: 1.4150 - regression_loss: 1.2069 - classification_loss: 0.2082 478/500 [===========================>..] - ETA: 7s - loss: 1.4148 - regression_loss: 1.2066 - classification_loss: 0.2081 479/500 [===========================>..] - ETA: 6s - loss: 1.4141 - regression_loss: 1.2060 - classification_loss: 0.2081 480/500 [===========================>..] - ETA: 6s - loss: 1.4133 - regression_loss: 1.2052 - classification_loss: 0.2080 481/500 [===========================>..] - ETA: 6s - loss: 1.4143 - regression_loss: 1.2063 - classification_loss: 0.2081 482/500 [===========================>..] - ETA: 5s - loss: 1.4139 - regression_loss: 1.2059 - classification_loss: 0.2079 483/500 [===========================>..] - ETA: 5s - loss: 1.4146 - regression_loss: 1.2066 - classification_loss: 0.2080 484/500 [============================>.] 
- ETA: 5s - loss: 1.4150 - regression_loss: 1.2069 - classification_loss: 0.2081 485/500 [============================>.] - ETA: 4s - loss: 1.4135 - regression_loss: 1.2056 - classification_loss: 0.2078 486/500 [============================>.] - ETA: 4s - loss: 1.4149 - regression_loss: 1.2069 - classification_loss: 0.2080 487/500 [============================>.] - ETA: 4s - loss: 1.4143 - regression_loss: 1.2064 - classification_loss: 0.2079 488/500 [============================>.] - ETA: 3s - loss: 1.4148 - regression_loss: 1.2069 - classification_loss: 0.2079 489/500 [============================>.] - ETA: 3s - loss: 1.4139 - regression_loss: 1.2061 - classification_loss: 0.2078 490/500 [============================>.] - ETA: 3s - loss: 1.4150 - regression_loss: 1.2070 - classification_loss: 0.2080 491/500 [============================>.] - ETA: 2s - loss: 1.4154 - regression_loss: 1.2074 - classification_loss: 0.2080 492/500 [============================>.] - ETA: 2s - loss: 1.4133 - regression_loss: 1.2056 - classification_loss: 0.2077 493/500 [============================>.] - ETA: 2s - loss: 1.4130 - regression_loss: 1.2053 - classification_loss: 0.2077 494/500 [============================>.] - ETA: 1s - loss: 1.4120 - regression_loss: 1.2046 - classification_loss: 0.2074 495/500 [============================>.] - ETA: 1s - loss: 1.4134 - regression_loss: 1.2058 - classification_loss: 0.2076 496/500 [============================>.] - ETA: 1s - loss: 1.4146 - regression_loss: 1.2068 - classification_loss: 0.2078 497/500 [============================>.] - ETA: 0s - loss: 1.4136 - regression_loss: 1.2060 - classification_loss: 0.2076 498/500 [============================>.] - ETA: 0s - loss: 1.4134 - regression_loss: 1.2058 - classification_loss: 0.2075 499/500 [============================>.] 
- ETA: 0s - loss: 1.4128 - regression_loss: 1.2053 - classification_loss: 0.2075 500/500 [==============================] - 166s 332ms/step - loss: 1.4130 - regression_loss: 1.2055 - classification_loss: 0.2075 1172 instances of class plum with average precision: 0.5814 mAP: 0.5814 Epoch 00018: saving model to ./training/snapshots/resnet101_pascal_18.h5 Epoch 19/150 1/500 [..............................] - ETA: 2:35 - loss: 0.8126 - regression_loss: 0.7111 - classification_loss: 0.1016 2/500 [..............................] - ETA: 2:41 - loss: 1.0184 - regression_loss: 0.8775 - classification_loss: 0.1409 3/500 [..............................] - ETA: 2:45 - loss: 1.2728 - regression_loss: 1.1085 - classification_loss: 0.1643 4/500 [..............................] - ETA: 2:43 - loss: 1.4015 - regression_loss: 1.2125 - classification_loss: 0.1890 5/500 [..............................] - ETA: 2:41 - loss: 1.4743 - regression_loss: 1.2607 - classification_loss: 0.2136 6/500 [..............................] - ETA: 2:40 - loss: 1.5281 - regression_loss: 1.3002 - classification_loss: 0.2278 7/500 [..............................] - ETA: 2:39 - loss: 1.5205 - regression_loss: 1.2938 - classification_loss: 0.2266 8/500 [..............................] - ETA: 2:39 - loss: 1.4376 - regression_loss: 1.2206 - classification_loss: 0.2170 9/500 [..............................] - ETA: 2:38 - loss: 1.4404 - regression_loss: 1.2266 - classification_loss: 0.2138 10/500 [..............................] - ETA: 2:37 - loss: 1.3222 - regression_loss: 1.1240 - classification_loss: 0.1982 11/500 [..............................] - ETA: 2:37 - loss: 1.3834 - regression_loss: 1.1750 - classification_loss: 0.2084 12/500 [..............................] - ETA: 2:38 - loss: 1.4325 - regression_loss: 1.2132 - classification_loss: 0.2194 13/500 [..............................] - ETA: 2:37 - loss: 1.3863 - regression_loss: 1.1754 - classification_loss: 0.2108 14/500 [..............................] 
- ETA: 2:37 - loss: 1.4167 - regression_loss: 1.2022 - classification_loss: 0.2145 15/500 [..............................] - ETA: 2:37 - loss: 1.3759 - regression_loss: 1.1668 - classification_loss: 0.2091 16/500 [..............................] - ETA: 2:37 - loss: 1.3910 - regression_loss: 1.1802 - classification_loss: 0.2109 17/500 [>.............................] - ETA: 2:37 - loss: 1.3930 - regression_loss: 1.1835 - classification_loss: 0.2096 18/500 [>.............................] - ETA: 2:37 - loss: 1.4330 - regression_loss: 1.2160 - classification_loss: 0.2169 19/500 [>.............................] - ETA: 2:37 - loss: 1.4597 - regression_loss: 1.2399 - classification_loss: 0.2198 20/500 [>.............................] - ETA: 2:36 - loss: 1.4190 - regression_loss: 1.2016 - classification_loss: 0.2174 21/500 [>.............................] - ETA: 2:37 - loss: 1.4265 - regression_loss: 1.2081 - classification_loss: 0.2183 22/500 [>.............................] - ETA: 2:37 - loss: 1.4370 - regression_loss: 1.2188 - classification_loss: 0.2182 23/500 [>.............................] - ETA: 2:37 - loss: 1.4484 - regression_loss: 1.2295 - classification_loss: 0.2188 24/500 [>.............................] - ETA: 2:37 - loss: 1.4565 - regression_loss: 1.2371 - classification_loss: 0.2194 25/500 [>.............................] - ETA: 2:36 - loss: 1.4888 - regression_loss: 1.2639 - classification_loss: 0.2249 26/500 [>.............................] - ETA: 2:36 - loss: 1.4931 - regression_loss: 1.2688 - classification_loss: 0.2242 27/500 [>.............................] - ETA: 2:35 - loss: 1.4848 - regression_loss: 1.2619 - classification_loss: 0.2229 28/500 [>.............................] - ETA: 2:35 - loss: 1.4870 - regression_loss: 1.2650 - classification_loss: 0.2220 29/500 [>.............................] - ETA: 2:34 - loss: 1.4983 - regression_loss: 1.2760 - classification_loss: 0.2224 30/500 [>.............................] 
- ETA: 2:34 - loss: 1.4861 - regression_loss: 1.2661 - classification_loss: 0.2200 31/500 [>.............................] - ETA: 2:34 - loss: 1.5014 - regression_loss: 1.2796 - classification_loss: 0.2218 32/500 [>.............................] - ETA: 2:33 - loss: 1.4754 - regression_loss: 1.2566 - classification_loss: 0.2188 33/500 [>.............................] - ETA: 2:33 - loss: 1.4751 - regression_loss: 1.2564 - classification_loss: 0.2188 34/500 [=>............................] - ETA: 2:32 - loss: 1.4842 - regression_loss: 1.2640 - classification_loss: 0.2202 35/500 [=>............................] - ETA: 2:31 - loss: 1.4918 - regression_loss: 1.2707 - classification_loss: 0.2211 36/500 [=>............................] - ETA: 2:31 - loss: 1.4695 - regression_loss: 1.2521 - classification_loss: 0.2174 37/500 [=>............................] - ETA: 2:31 - loss: 1.4566 - regression_loss: 1.2423 - classification_loss: 0.2143 38/500 [=>............................] - ETA: 2:31 - loss: 1.4516 - regression_loss: 1.2385 - classification_loss: 0.2131 39/500 [=>............................] - ETA: 2:30 - loss: 1.4302 - regression_loss: 1.2199 - classification_loss: 0.2103 40/500 [=>............................] - ETA: 2:30 - loss: 1.4219 - regression_loss: 1.2124 - classification_loss: 0.2094 41/500 [=>............................] - ETA: 2:30 - loss: 1.4197 - regression_loss: 1.2112 - classification_loss: 0.2086 42/500 [=>............................] - ETA: 2:30 - loss: 1.3997 - regression_loss: 1.1935 - classification_loss: 0.2062 43/500 [=>............................] - ETA: 2:29 - loss: 1.3802 - regression_loss: 1.1768 - classification_loss: 0.2034 44/500 [=>............................] - ETA: 2:29 - loss: 1.3846 - regression_loss: 1.1812 - classification_loss: 0.2034 45/500 [=>............................] - ETA: 2:29 - loss: 1.3837 - regression_loss: 1.1797 - classification_loss: 0.2041 46/500 [=>............................] 
- ETA: 2:29 - loss: 1.3823 - regression_loss: 1.1791 - classification_loss: 0.2032 47/500 [=>............................] - ETA: 2:29 - loss: 1.3686 - regression_loss: 1.1676 - classification_loss: 0.2010 48/500 [=>............................] - ETA: 2:28 - loss: 1.3678 - regression_loss: 1.1670 - classification_loss: 0.2008 49/500 [=>............................] - ETA: 2:28 - loss: 1.3588 - regression_loss: 1.1591 - classification_loss: 0.1997 50/500 [==>...........................] - ETA: 2:28 - loss: 1.3670 - regression_loss: 1.1658 - classification_loss: 0.2012 51/500 [==>...........................] - ETA: 2:27 - loss: 1.3648 - regression_loss: 1.1648 - classification_loss: 0.2000 52/500 [==>...........................] - ETA: 2:27 - loss: 1.3686 - regression_loss: 1.1671 - classification_loss: 0.2015 53/500 [==>...........................] - ETA: 2:27 - loss: 1.3708 - regression_loss: 1.1684 - classification_loss: 0.2025 54/500 [==>...........................] - ETA: 2:27 - loss: 1.3748 - regression_loss: 1.1726 - classification_loss: 0.2023 55/500 [==>...........................] - ETA: 2:26 - loss: 1.3652 - regression_loss: 1.1643 - classification_loss: 0.2009 56/500 [==>...........................] - ETA: 2:26 - loss: 1.3648 - regression_loss: 1.1640 - classification_loss: 0.2008 57/500 [==>...........................] - ETA: 2:26 - loss: 1.3672 - regression_loss: 1.1657 - classification_loss: 0.2015 58/500 [==>...........................] - ETA: 2:25 - loss: 1.3695 - regression_loss: 1.1682 - classification_loss: 0.2013 59/500 [==>...........................] - ETA: 2:25 - loss: 1.3674 - regression_loss: 1.1663 - classification_loss: 0.2011 60/500 [==>...........................] - ETA: 2:25 - loss: 1.3727 - regression_loss: 1.1710 - classification_loss: 0.2017 61/500 [==>...........................] - ETA: 2:25 - loss: 1.3731 - regression_loss: 1.1706 - classification_loss: 0.2024 62/500 [==>...........................] 
- ETA: 2:24 - loss: 1.3741 - regression_loss: 1.1715 - classification_loss: 0.2026 63/500 [==>...........................] - ETA: 2:24 - loss: 1.3731 - regression_loss: 1.1713 - classification_loss: 0.2017 64/500 [==>...........................] - ETA: 2:23 - loss: 1.3698 - regression_loss: 1.1689 - classification_loss: 0.2010 65/500 [==>...........................] - ETA: 2:23 - loss: 1.3832 - regression_loss: 1.1819 - classification_loss: 0.2013 66/500 [==>...........................] - ETA: 2:23 - loss: 1.3840 - regression_loss: 1.1822 - classification_loss: 0.2018 67/500 [===>..........................] - ETA: 2:22 - loss: 1.3752 - regression_loss: 1.1748 - classification_loss: 0.2004 68/500 [===>..........................] - ETA: 2:22 - loss: 1.3864 - regression_loss: 1.1836 - classification_loss: 0.2028 69/500 [===>..........................] - ETA: 2:21 - loss: 1.3812 - regression_loss: 1.1794 - classification_loss: 0.2019 70/500 [===>..........................] - ETA: 2:21 - loss: 1.3728 - regression_loss: 1.1721 - classification_loss: 0.2007 71/500 [===>..........................] - ETA: 2:21 - loss: 1.3764 - regression_loss: 1.1748 - classification_loss: 0.2016 72/500 [===>..........................] - ETA: 2:21 - loss: 1.3766 - regression_loss: 1.1747 - classification_loss: 0.2019 73/500 [===>..........................] - ETA: 2:20 - loss: 1.3784 - regression_loss: 1.1763 - classification_loss: 0.2021 74/500 [===>..........................] - ETA: 2:20 - loss: 1.3697 - regression_loss: 1.1684 - classification_loss: 0.2013 75/500 [===>..........................] - ETA: 2:20 - loss: 1.3754 - regression_loss: 1.1737 - classification_loss: 0.2017 76/500 [===>..........................] - ETA: 2:19 - loss: 1.3827 - regression_loss: 1.1800 - classification_loss: 0.2027 77/500 [===>..........................] - ETA: 2:19 - loss: 1.3799 - regression_loss: 1.1775 - classification_loss: 0.2025 78/500 [===>..........................] 
- ETA: 2:19 - loss: 1.3870 - regression_loss: 1.1832 - classification_loss: 0.2039 79/500 [===>..........................] - ETA: 2:18 - loss: 1.3921 - regression_loss: 1.1875 - classification_loss: 0.2046 80/500 [===>..........................] - ETA: 2:18 - loss: 1.3954 - regression_loss: 1.1900 - classification_loss: 0.2054 81/500 [===>..........................] - ETA: 2:18 - loss: 1.3969 - regression_loss: 1.1914 - classification_loss: 0.2054 82/500 [===>..........................] - ETA: 2:18 - loss: 1.3977 - regression_loss: 1.1923 - classification_loss: 0.2054 83/500 [===>..........................] - ETA: 2:17 - loss: 1.3949 - regression_loss: 1.1897 - classification_loss: 0.2053 84/500 [====>.........................] - ETA: 2:17 - loss: 1.4012 - regression_loss: 1.1951 - classification_loss: 0.2061 85/500 [====>.........................] - ETA: 2:16 - loss: 1.4053 - regression_loss: 1.1989 - classification_loss: 0.2065 86/500 [====>.........................] - ETA: 2:16 - loss: 1.4031 - regression_loss: 1.1959 - classification_loss: 0.2073 87/500 [====>.........................] - ETA: 2:16 - loss: 1.4023 - regression_loss: 1.1952 - classification_loss: 0.2071 88/500 [====>.........................] - ETA: 2:16 - loss: 1.4034 - regression_loss: 1.1961 - classification_loss: 0.2073 89/500 [====>.........................] - ETA: 2:15 - loss: 1.3998 - regression_loss: 1.1928 - classification_loss: 0.2070 90/500 [====>.........................] - ETA: 2:15 - loss: 1.3941 - regression_loss: 1.1883 - classification_loss: 0.2059 91/500 [====>.........................] - ETA: 2:15 - loss: 1.3855 - regression_loss: 1.1801 - classification_loss: 0.2054 92/500 [====>.........................] - ETA: 2:14 - loss: 1.3829 - regression_loss: 1.1782 - classification_loss: 0.2047 93/500 [====>.........................] - ETA: 2:14 - loss: 1.3811 - regression_loss: 1.1761 - classification_loss: 0.2050 94/500 [====>.........................] 
- ETA: 2:14 - loss: 1.3889 - regression_loss: 1.1826 - classification_loss: 0.2062 95/500 [====>.........................] - ETA: 2:13 - loss: 1.3926 - regression_loss: 1.1862 - classification_loss: 0.2064 96/500 [====>.........................] - ETA: 2:13 - loss: 1.3868 - regression_loss: 1.1815 - classification_loss: 0.2053 97/500 [====>.........................] - ETA: 2:13 - loss: 1.3933 - regression_loss: 1.1863 - classification_loss: 0.2070 98/500 [====>.........................] - ETA: 2:12 - loss: 1.3873 - regression_loss: 1.1813 - classification_loss: 0.2060 99/500 [====>.........................] - ETA: 2:12 - loss: 1.3906 - regression_loss: 1.1844 - classification_loss: 0.2061 100/500 [=====>........................] - ETA: 2:11 - loss: 1.3923 - regression_loss: 1.1860 - classification_loss: 0.2063 101/500 [=====>........................] - ETA: 2:11 - loss: 1.3944 - regression_loss: 1.1881 - classification_loss: 0.2064 102/500 [=====>........................] - ETA: 2:11 - loss: 1.3980 - regression_loss: 1.1913 - classification_loss: 0.2067 103/500 [=====>........................] - ETA: 2:11 - loss: 1.3922 - regression_loss: 1.1851 - classification_loss: 0.2071 104/500 [=====>........................] - ETA: 2:10 - loss: 1.3863 - regression_loss: 1.1804 - classification_loss: 0.2059 105/500 [=====>........................] - ETA: 2:10 - loss: 1.3807 - regression_loss: 1.1755 - classification_loss: 0.2052 106/500 [=====>........................] - ETA: 2:09 - loss: 1.3775 - regression_loss: 1.1710 - classification_loss: 0.2065 107/500 [=====>........................] - ETA: 2:09 - loss: 1.3789 - regression_loss: 1.1724 - classification_loss: 0.2064 108/500 [=====>........................] - ETA: 2:09 - loss: 1.3760 - regression_loss: 1.1702 - classification_loss: 0.2058 109/500 [=====>........................] - ETA: 2:09 - loss: 1.3845 - regression_loss: 1.1776 - classification_loss: 0.2069 110/500 [=====>........................] 
- ETA: 2:08 - loss: 1.3971 - regression_loss: 1.1879 - classification_loss: 0.2092 111/500 [=====>........................] - ETA: 2:08 - loss: 1.4031 - regression_loss: 1.1928 - classification_loss: 0.2102 112/500 [=====>........................] - ETA: 2:07 - loss: 1.4019 - regression_loss: 1.1924 - classification_loss: 0.2096 113/500 [=====>........................] - ETA: 2:07 - loss: 1.4028 - regression_loss: 1.1926 - classification_loss: 0.2103 114/500 [=====>........................] - ETA: 2:07 - loss: 1.4064 - regression_loss: 1.1956 - classification_loss: 0.2108 115/500 [=====>........................] - ETA: 2:07 - loss: 1.4066 - regression_loss: 1.1960 - classification_loss: 0.2106 116/500 [=====>........................] - ETA: 2:06 - loss: 1.4093 - regression_loss: 1.1976 - classification_loss: 0.2117 117/500 [======>.......................] - ETA: 2:06 - loss: 1.4053 - regression_loss: 1.1942 - classification_loss: 0.2111 118/500 [======>.......................] - ETA: 2:05 - loss: 1.4039 - regression_loss: 1.1928 - classification_loss: 0.2111 119/500 [======>.......................] - ETA: 2:05 - loss: 1.4012 - regression_loss: 1.1908 - classification_loss: 0.2105 120/500 [======>.......................] - ETA: 2:05 - loss: 1.4010 - regression_loss: 1.1905 - classification_loss: 0.2105 121/500 [======>.......................] - ETA: 2:04 - loss: 1.4043 - regression_loss: 1.1938 - classification_loss: 0.2105 122/500 [======>.......................] - ETA: 2:04 - loss: 1.4066 - regression_loss: 1.1961 - classification_loss: 0.2105 123/500 [======>.......................] - ETA: 2:04 - loss: 1.4096 - regression_loss: 1.1984 - classification_loss: 0.2112 124/500 [======>.......................] - ETA: 2:04 - loss: 1.4130 - regression_loss: 1.2018 - classification_loss: 0.2112 125/500 [======>.......................] - ETA: 2:03 - loss: 1.4159 - regression_loss: 1.2045 - classification_loss: 0.2113 126/500 [======>.......................] 
- ETA: 2:03 - loss: 1.4187 - regression_loss: 1.2072 - classification_loss: 0.2115 127/500 [======>.......................] - ETA: 2:03 - loss: 1.4197 - regression_loss: 1.2045 - classification_loss: 0.2152 128/500 [======>.......................] - ETA: 2:02 - loss: 1.4215 - regression_loss: 1.2060 - classification_loss: 0.2155 129/500 [======>.......................] - ETA: 2:02 - loss: 1.4187 - regression_loss: 1.2041 - classification_loss: 0.2146 130/500 [======>.......................] - ETA: 2:02 - loss: 1.4222 - regression_loss: 1.2069 - classification_loss: 0.2154 131/500 [======>.......................] - ETA: 2:01 - loss: 1.4220 - regression_loss: 1.2069 - classification_loss: 0.2151 132/500 [======>.......................] - ETA: 2:01 - loss: 1.4265 - regression_loss: 1.2104 - classification_loss: 0.2161 133/500 [======>.......................] - ETA: 2:01 - loss: 1.4304 - regression_loss: 1.2138 - classification_loss: 0.2166 134/500 [=======>......................] - ETA: 2:00 - loss: 1.4301 - regression_loss: 1.2135 - classification_loss: 0.2165 135/500 [=======>......................] - ETA: 2:00 - loss: 1.4296 - regression_loss: 1.2133 - classification_loss: 0.2163 136/500 [=======>......................] - ETA: 2:00 - loss: 1.4291 - regression_loss: 1.2134 - classification_loss: 0.2158 137/500 [=======>......................] - ETA: 1:59 - loss: 1.4256 - regression_loss: 1.2100 - classification_loss: 0.2156 138/500 [=======>......................] - ETA: 1:59 - loss: 1.4274 - regression_loss: 1.2115 - classification_loss: 0.2159 139/500 [=======>......................] - ETA: 1:59 - loss: 1.4275 - regression_loss: 1.2115 - classification_loss: 0.2160 140/500 [=======>......................] - ETA: 1:58 - loss: 1.4246 - regression_loss: 1.2089 - classification_loss: 0.2157 141/500 [=======>......................] - ETA: 1:58 - loss: 1.4247 - regression_loss: 1.2089 - classification_loss: 0.2158 142/500 [=======>......................] 
- ETA: 1:58 - loss: 1.4284 - regression_loss: 1.2120 - classification_loss: 0.2164 143/500 [=======>......................] - ETA: 1:57 - loss: 1.4241 - regression_loss: 1.2087 - classification_loss: 0.2154 144/500 [=======>......................] - ETA: 1:57 - loss: 1.4274 - regression_loss: 1.2116 - classification_loss: 0.2158 145/500 [=======>......................] - ETA: 1:57 - loss: 1.4234 - regression_loss: 1.2084 - classification_loss: 0.2150 146/500 [=======>......................] - ETA: 1:56 - loss: 1.4189 - regression_loss: 1.2046 - classification_loss: 0.2143 147/500 [=======>......................] - ETA: 1:56 - loss: 1.4137 - regression_loss: 1.1999 - classification_loss: 0.2137 148/500 [=======>......................] - ETA: 1:56 - loss: 1.4110 - regression_loss: 1.1977 - classification_loss: 0.2133 149/500 [=======>......................] - ETA: 1:55 - loss: 1.4140 - regression_loss: 1.1999 - classification_loss: 0.2141 150/500 [========>.....................] - ETA: 1:55 - loss: 1.4123 - regression_loss: 1.1985 - classification_loss: 0.2139 151/500 [========>.....................] - ETA: 1:55 - loss: 1.4139 - regression_loss: 1.1997 - classification_loss: 0.2142 152/500 [========>.....................] - ETA: 1:54 - loss: 1.4151 - regression_loss: 1.2009 - classification_loss: 0.2142 153/500 [========>.....................] - ETA: 1:54 - loss: 1.4143 - regression_loss: 1.2002 - classification_loss: 0.2141 154/500 [========>.....................] - ETA: 1:54 - loss: 1.4179 - regression_loss: 1.2035 - classification_loss: 0.2144 155/500 [========>.....................] - ETA: 1:53 - loss: 1.4174 - regression_loss: 1.2032 - classification_loss: 0.2142 156/500 [========>.....................] - ETA: 1:53 - loss: 1.4123 - regression_loss: 1.1986 - classification_loss: 0.2137 157/500 [========>.....................] - ETA: 1:53 - loss: 1.4071 - regression_loss: 1.1944 - classification_loss: 0.2127 158/500 [========>.....................] 
- ETA: 1:52 - loss: 1.4034 - regression_loss: 1.1913 - classification_loss: 0.2121
159/500 [========>.....................] - ETA: 1:52 - loss: 1.4060 - regression_loss: 1.1934 - classification_loss: 0.2126
200/500 [===========>..................] - ETA: 1:39 - loss: 1.4038 - regression_loss: 1.1945 - classification_loss: 0.2094
250/500 [==============>...............] - ETA: 1:22 - loss: 1.3920 - regression_loss: 1.1835 - classification_loss: 0.2085
300/500 [=================>............] - ETA: 1:06 - loss: 1.4062 - regression_loss: 1.1965 - classification_loss: 0.2097
350/500 [====================>.........] - ETA: 49s - loss: 1.4011 - regression_loss: 1.1922 - classification_loss: 0.2088
400/500 [=======================>......] - ETA: 33s - loss: 1.3979 - regression_loss: 1.1900 - classification_loss: 0.2080
450/500 [==========================>...] - ETA: 16s - loss: 1.3965 - regression_loss: 1.1888 - classification_loss: 0.2077
493/500 [============================>.] - ETA: 2s - loss: 1.3860 - regression_loss: 1.1800 - classification_loss: 0.2060
494/500 [============================>.] 
- ETA: 1s - loss: 1.3869 - regression_loss: 1.1808 - classification_loss: 0.2061 495/500 [============================>.] - ETA: 1s - loss: 1.3865 - regression_loss: 1.1804 - classification_loss: 0.2061 496/500 [============================>.] - ETA: 1s - loss: 1.3844 - regression_loss: 1.1785 - classification_loss: 0.2059 497/500 [============================>.] - ETA: 0s - loss: 1.3855 - regression_loss: 1.1794 - classification_loss: 0.2061 498/500 [============================>.] - ETA: 0s - loss: 1.3855 - regression_loss: 1.1794 - classification_loss: 0.2061 499/500 [============================>.] - ETA: 0s - loss: 1.3872 - regression_loss: 1.1809 - classification_loss: 0.2064 500/500 [==============================] - 166s 331ms/step - loss: 1.3878 - regression_loss: 1.1814 - classification_loss: 0.2064 1172 instances of class plum with average precision: 0.5915 mAP: 0.5915 Epoch 00019: saving model to ./training/snapshots/resnet101_pascal_19.h5 Epoch 20/150 1/500 [..............................] - ETA: 2:38 - loss: 1.5159 - regression_loss: 1.3137 - classification_loss: 0.2022 2/500 [..............................] - ETA: 2:39 - loss: 1.1827 - regression_loss: 1.0403 - classification_loss: 0.1424 3/500 [..............................] - ETA: 2:43 - loss: 1.1256 - regression_loss: 0.9827 - classification_loss: 0.1429 4/500 [..............................] - ETA: 2:44 - loss: 1.1544 - regression_loss: 1.0020 - classification_loss: 0.1524 5/500 [..............................] - ETA: 2:42 - loss: 1.1690 - regression_loss: 1.0214 - classification_loss: 0.1475 6/500 [..............................] - ETA: 2:42 - loss: 1.1028 - regression_loss: 0.9594 - classification_loss: 0.1434 7/500 [..............................] - ETA: 2:43 - loss: 1.1044 - regression_loss: 0.9577 - classification_loss: 0.1467 8/500 [..............................] - ETA: 2:42 - loss: 1.1499 - regression_loss: 1.0009 - classification_loss: 0.1490 9/500 [..............................] 
- ETA: 2:41 - loss: 1.1805 - regression_loss: 1.0250 - classification_loss: 0.1555 10/500 [..............................] - ETA: 2:40 - loss: 1.2350 - regression_loss: 1.0707 - classification_loss: 0.1644 11/500 [..............................] - ETA: 2:41 - loss: 1.2218 - regression_loss: 1.0572 - classification_loss: 0.1646 12/500 [..............................] - ETA: 2:41 - loss: 1.2789 - regression_loss: 1.1076 - classification_loss: 0.1713 13/500 [..............................] - ETA: 2:42 - loss: 1.3160 - regression_loss: 1.1415 - classification_loss: 0.1746 14/500 [..............................] - ETA: 2:42 - loss: 1.2841 - regression_loss: 1.1131 - classification_loss: 0.1710 15/500 [..............................] - ETA: 2:42 - loss: 1.3361 - regression_loss: 1.1565 - classification_loss: 0.1797 16/500 [..............................] - ETA: 2:41 - loss: 1.3234 - regression_loss: 1.1443 - classification_loss: 0.1790 17/500 [>.............................] - ETA: 2:41 - loss: 1.2981 - regression_loss: 1.1216 - classification_loss: 0.1764 18/500 [>.............................] - ETA: 2:40 - loss: 1.2917 - regression_loss: 1.1189 - classification_loss: 0.1728 19/500 [>.............................] - ETA: 2:40 - loss: 1.2949 - regression_loss: 1.1203 - classification_loss: 0.1746 20/500 [>.............................] - ETA: 2:40 - loss: 1.2986 - regression_loss: 1.1235 - classification_loss: 0.1751 21/500 [>.............................] - ETA: 2:40 - loss: 1.2829 - regression_loss: 1.1061 - classification_loss: 0.1768 22/500 [>.............................] - ETA: 2:40 - loss: 1.2717 - regression_loss: 1.0957 - classification_loss: 0.1760 23/500 [>.............................] - ETA: 2:39 - loss: 1.2468 - regression_loss: 1.0723 - classification_loss: 0.1745 24/500 [>.............................] - ETA: 2:39 - loss: 1.2684 - regression_loss: 1.0894 - classification_loss: 0.1790 25/500 [>.............................] 
- ETA: 2:38 - loss: 1.2515 - regression_loss: 1.0754 - classification_loss: 0.1761 26/500 [>.............................] - ETA: 2:37 - loss: 1.2737 - regression_loss: 1.0931 - classification_loss: 0.1806 27/500 [>.............................] - ETA: 2:37 - loss: 1.2514 - regression_loss: 1.0748 - classification_loss: 0.1767 28/500 [>.............................] - ETA: 2:36 - loss: 1.2591 - regression_loss: 1.0814 - classification_loss: 0.1776 29/500 [>.............................] - ETA: 2:36 - loss: 1.2780 - regression_loss: 1.0964 - classification_loss: 0.1815 30/500 [>.............................] - ETA: 2:36 - loss: 1.2788 - regression_loss: 1.0970 - classification_loss: 0.1818 31/500 [>.............................] - ETA: 2:35 - loss: 1.2835 - regression_loss: 1.0999 - classification_loss: 0.1836 32/500 [>.............................] - ETA: 2:35 - loss: 1.2932 - regression_loss: 1.1086 - classification_loss: 0.1846 33/500 [>.............................] - ETA: 2:34 - loss: 1.2936 - regression_loss: 1.1094 - classification_loss: 0.1841 34/500 [=>............................] - ETA: 2:34 - loss: 1.3075 - regression_loss: 1.1213 - classification_loss: 0.1863 35/500 [=>............................] - ETA: 2:34 - loss: 1.3086 - regression_loss: 1.1215 - classification_loss: 0.1871 36/500 [=>............................] - ETA: 2:33 - loss: 1.3075 - regression_loss: 1.1206 - classification_loss: 0.1868 37/500 [=>............................] - ETA: 2:33 - loss: 1.2904 - regression_loss: 1.1068 - classification_loss: 0.1836 38/500 [=>............................] - ETA: 2:33 - loss: 1.2940 - regression_loss: 1.1101 - classification_loss: 0.1839 39/500 [=>............................] - ETA: 2:33 - loss: 1.2973 - regression_loss: 1.1116 - classification_loss: 0.1857 40/500 [=>............................] - ETA: 2:32 - loss: 1.2863 - regression_loss: 1.1007 - classification_loss: 0.1856 41/500 [=>............................] 
- ETA: 2:32 - loss: 1.2955 - regression_loss: 1.1082 - classification_loss: 0.1873 42/500 [=>............................] - ETA: 2:32 - loss: 1.2944 - regression_loss: 1.1086 - classification_loss: 0.1858 43/500 [=>............................] - ETA: 2:31 - loss: 1.2976 - regression_loss: 1.1113 - classification_loss: 0.1864 44/500 [=>............................] - ETA: 2:31 - loss: 1.2841 - regression_loss: 1.0998 - classification_loss: 0.1843 45/500 [=>............................] - ETA: 2:30 - loss: 1.2980 - regression_loss: 1.1122 - classification_loss: 0.1858 46/500 [=>............................] - ETA: 2:30 - loss: 1.2907 - regression_loss: 1.1062 - classification_loss: 0.1844 47/500 [=>............................] - ETA: 2:29 - loss: 1.2975 - regression_loss: 1.1128 - classification_loss: 0.1847 48/500 [=>............................] - ETA: 2:29 - loss: 1.2904 - regression_loss: 1.1069 - classification_loss: 0.1834 49/500 [=>............................] - ETA: 2:29 - loss: 1.2827 - regression_loss: 1.1005 - classification_loss: 0.1822 50/500 [==>...........................] - ETA: 2:28 - loss: 1.2853 - regression_loss: 1.1031 - classification_loss: 0.1822 51/500 [==>...........................] - ETA: 2:28 - loss: 1.2694 - regression_loss: 1.0888 - classification_loss: 0.1806 52/500 [==>...........................] - ETA: 2:28 - loss: 1.2680 - regression_loss: 1.0881 - classification_loss: 0.1799 53/500 [==>...........................] - ETA: 2:27 - loss: 1.2682 - regression_loss: 1.0884 - classification_loss: 0.1798 54/500 [==>...........................] - ETA: 2:27 - loss: 1.2709 - regression_loss: 1.0911 - classification_loss: 0.1798 55/500 [==>...........................] - ETA: 2:27 - loss: 1.2608 - regression_loss: 1.0823 - classification_loss: 0.1785 56/500 [==>...........................] - ETA: 2:26 - loss: 1.2530 - regression_loss: 1.0755 - classification_loss: 0.1774 57/500 [==>...........................] 
- ETA: 2:26 - loss: 1.2615 - regression_loss: 1.0827 - classification_loss: 0.1788 58/500 [==>...........................] - ETA: 2:26 - loss: 1.2720 - regression_loss: 1.0912 - classification_loss: 0.1808 59/500 [==>...........................] - ETA: 2:26 - loss: 1.2628 - regression_loss: 1.0819 - classification_loss: 0.1809 60/500 [==>...........................] - ETA: 2:26 - loss: 1.2621 - regression_loss: 1.0805 - classification_loss: 0.1816 61/500 [==>...........................] - ETA: 2:25 - loss: 1.2521 - regression_loss: 1.0705 - classification_loss: 0.1816 62/500 [==>...........................] - ETA: 2:25 - loss: 1.2567 - regression_loss: 1.0746 - classification_loss: 0.1821 63/500 [==>...........................] - ETA: 2:25 - loss: 1.2461 - regression_loss: 1.0654 - classification_loss: 0.1808 64/500 [==>...........................] - ETA: 2:24 - loss: 1.2458 - regression_loss: 1.0645 - classification_loss: 0.1812 65/500 [==>...........................] - ETA: 2:24 - loss: 1.2417 - regression_loss: 1.0610 - classification_loss: 0.1807 66/500 [==>...........................] - ETA: 2:24 - loss: 1.2462 - regression_loss: 1.0651 - classification_loss: 0.1811 67/500 [===>..........................] - ETA: 2:23 - loss: 1.2510 - regression_loss: 1.0692 - classification_loss: 0.1818 68/500 [===>..........................] - ETA: 2:23 - loss: 1.2475 - regression_loss: 1.0662 - classification_loss: 0.1813 69/500 [===>..........................] - ETA: 2:23 - loss: 1.2407 - regression_loss: 1.0607 - classification_loss: 0.1799 70/500 [===>..........................] - ETA: 2:22 - loss: 1.2438 - regression_loss: 1.0637 - classification_loss: 0.1801 71/500 [===>..........................] - ETA: 2:22 - loss: 1.2381 - regression_loss: 1.0591 - classification_loss: 0.1790 72/500 [===>..........................] - ETA: 2:21 - loss: 1.2418 - regression_loss: 1.0621 - classification_loss: 0.1797 73/500 [===>..........................] 
- ETA: 2:21 - loss: 1.2471 - regression_loss: 1.0667 - classification_loss: 0.1803 74/500 [===>..........................] - ETA: 2:21 - loss: 1.2482 - regression_loss: 1.0679 - classification_loss: 0.1803 75/500 [===>..........................] - ETA: 2:20 - loss: 1.2475 - regression_loss: 1.0674 - classification_loss: 0.1801 76/500 [===>..........................] - ETA: 2:20 - loss: 1.2497 - regression_loss: 1.0688 - classification_loss: 0.1809 77/500 [===>..........................] - ETA: 2:19 - loss: 1.2495 - regression_loss: 1.0687 - classification_loss: 0.1808 78/500 [===>..........................] - ETA: 2:19 - loss: 1.2513 - regression_loss: 1.0713 - classification_loss: 0.1799 79/500 [===>..........................] - ETA: 2:19 - loss: 1.2458 - regression_loss: 1.0669 - classification_loss: 0.1790 80/500 [===>..........................] - ETA: 2:19 - loss: 1.2499 - regression_loss: 1.0703 - classification_loss: 0.1796 81/500 [===>..........................] - ETA: 2:18 - loss: 1.2531 - regression_loss: 1.0734 - classification_loss: 0.1796 82/500 [===>..........................] - ETA: 2:18 - loss: 1.2556 - regression_loss: 1.0756 - classification_loss: 0.1800 83/500 [===>..........................] - ETA: 2:18 - loss: 1.2490 - regression_loss: 1.0701 - classification_loss: 0.1790 84/500 [====>.........................] - ETA: 2:17 - loss: 1.2491 - regression_loss: 1.0703 - classification_loss: 0.1788 85/500 [====>.........................] - ETA: 2:17 - loss: 1.2410 - regression_loss: 1.0632 - classification_loss: 0.1778 86/500 [====>.........................] - ETA: 2:17 - loss: 1.2507 - regression_loss: 1.0703 - classification_loss: 0.1804 87/500 [====>.........................] - ETA: 2:17 - loss: 1.2619 - regression_loss: 1.0807 - classification_loss: 0.1812 88/500 [====>.........................] - ETA: 2:16 - loss: 1.2574 - regression_loss: 1.0767 - classification_loss: 0.1807 89/500 [====>.........................] 
- ETA: 2:16 - loss: 1.2585 - regression_loss: 1.0780 - classification_loss: 0.1805 90/500 [====>.........................] - ETA: 2:16 - loss: 1.2579 - regression_loss: 1.0774 - classification_loss: 0.1806 91/500 [====>.........................] - ETA: 2:15 - loss: 1.2586 - regression_loss: 1.0782 - classification_loss: 0.1805 92/500 [====>.........................] - ETA: 2:15 - loss: 1.2611 - regression_loss: 1.0806 - classification_loss: 0.1805 93/500 [====>.........................] - ETA: 2:15 - loss: 1.2650 - regression_loss: 1.0838 - classification_loss: 0.1812 94/500 [====>.........................] - ETA: 2:14 - loss: 1.2595 - regression_loss: 1.0791 - classification_loss: 0.1805 95/500 [====>.........................] - ETA: 2:14 - loss: 1.2496 - regression_loss: 1.0703 - classification_loss: 0.1793 96/500 [====>.........................] - ETA: 2:14 - loss: 1.2543 - regression_loss: 1.0742 - classification_loss: 0.1800 97/500 [====>.........................] - ETA: 2:13 - loss: 1.2583 - regression_loss: 1.0778 - classification_loss: 0.1805 98/500 [====>.........................] - ETA: 2:13 - loss: 1.2580 - regression_loss: 1.0775 - classification_loss: 0.1805 99/500 [====>.........................] - ETA: 2:13 - loss: 1.2620 - regression_loss: 1.0810 - classification_loss: 0.1810 100/500 [=====>........................] - ETA: 2:12 - loss: 1.2656 - regression_loss: 1.0841 - classification_loss: 0.1815 101/500 [=====>........................] - ETA: 2:12 - loss: 1.2637 - regression_loss: 1.0821 - classification_loss: 0.1816 102/500 [=====>........................] - ETA: 2:12 - loss: 1.2624 - regression_loss: 1.0812 - classification_loss: 0.1812 103/500 [=====>........................] - ETA: 2:11 - loss: 1.2660 - regression_loss: 1.0841 - classification_loss: 0.1819 104/500 [=====>........................] - ETA: 2:11 - loss: 1.2590 - regression_loss: 1.0780 - classification_loss: 0.1811 105/500 [=====>........................] 
- ETA: 2:11 - loss: 1.2557 - regression_loss: 1.0752 - classification_loss: 0.1805 106/500 [=====>........................] - ETA: 2:10 - loss: 1.2584 - regression_loss: 1.0776 - classification_loss: 0.1807 107/500 [=====>........................] - ETA: 2:10 - loss: 1.2543 - regression_loss: 1.0743 - classification_loss: 0.1800 108/500 [=====>........................] - ETA: 2:10 - loss: 1.2496 - regression_loss: 1.0702 - classification_loss: 0.1794 109/500 [=====>........................] - ETA: 2:10 - loss: 1.2515 - regression_loss: 1.0720 - classification_loss: 0.1795 110/500 [=====>........................] - ETA: 2:09 - loss: 1.2516 - regression_loss: 1.0720 - classification_loss: 0.1796 111/500 [=====>........................] - ETA: 2:09 - loss: 1.2551 - regression_loss: 1.0748 - classification_loss: 0.1803 112/500 [=====>........................] - ETA: 2:09 - loss: 1.2560 - regression_loss: 1.0763 - classification_loss: 0.1797 113/500 [=====>........................] - ETA: 2:08 - loss: 1.2581 - regression_loss: 1.0781 - classification_loss: 0.1800 114/500 [=====>........................] - ETA: 2:08 - loss: 1.2564 - regression_loss: 1.0765 - classification_loss: 0.1799 115/500 [=====>........................] - ETA: 2:07 - loss: 1.2563 - regression_loss: 1.0765 - classification_loss: 0.1799 116/500 [=====>........................] - ETA: 2:07 - loss: 1.2633 - regression_loss: 1.0821 - classification_loss: 0.1812 117/500 [======>.......................] - ETA: 2:07 - loss: 1.2706 - regression_loss: 1.0885 - classification_loss: 0.1821 118/500 [======>.......................] - ETA: 2:06 - loss: 1.2683 - regression_loss: 1.0866 - classification_loss: 0.1816 119/500 [======>.......................] - ETA: 2:06 - loss: 1.2653 - regression_loss: 1.0842 - classification_loss: 0.1811 120/500 [======>.......................] - ETA: 2:06 - loss: 1.2684 - regression_loss: 1.0875 - classification_loss: 0.1809 121/500 [======>.......................] 
- ETA: 2:05 - loss: 1.2657 - regression_loss: 1.0852 - classification_loss: 0.1804 122/500 [======>.......................] - ETA: 2:05 - loss: 1.2691 - regression_loss: 1.0880 - classification_loss: 0.1811 123/500 [======>.......................] - ETA: 2:05 - loss: 1.2673 - regression_loss: 1.0863 - classification_loss: 0.1810 124/500 [======>.......................] - ETA: 2:05 - loss: 1.2698 - regression_loss: 1.0885 - classification_loss: 0.1814 125/500 [======>.......................] - ETA: 2:04 - loss: 1.2717 - regression_loss: 1.0900 - classification_loss: 0.1817 126/500 [======>.......................] - ETA: 2:04 - loss: 1.2721 - regression_loss: 1.0902 - classification_loss: 0.1820 127/500 [======>.......................] - ETA: 2:04 - loss: 1.2742 - regression_loss: 1.0919 - classification_loss: 0.1823 128/500 [======>.......................] - ETA: 2:03 - loss: 1.2775 - regression_loss: 1.0945 - classification_loss: 0.1830 129/500 [======>.......................] - ETA: 2:03 - loss: 1.2778 - regression_loss: 1.0949 - classification_loss: 0.1829 130/500 [======>.......................] - ETA: 2:03 - loss: 1.2809 - regression_loss: 1.0974 - classification_loss: 0.1835 131/500 [======>.......................] - ETA: 2:02 - loss: 1.2860 - regression_loss: 1.1014 - classification_loss: 0.1846 132/500 [======>.......................] - ETA: 2:02 - loss: 1.2905 - regression_loss: 1.1050 - classification_loss: 0.1855 133/500 [======>.......................] - ETA: 2:02 - loss: 1.2930 - regression_loss: 1.1075 - classification_loss: 0.1855 134/500 [=======>......................] - ETA: 2:01 - loss: 1.2926 - regression_loss: 1.1079 - classification_loss: 0.1848 135/500 [=======>......................] - ETA: 2:01 - loss: 1.2931 - regression_loss: 1.1081 - classification_loss: 0.1851 136/500 [=======>......................] - ETA: 2:01 - loss: 1.2936 - regression_loss: 1.1088 - classification_loss: 0.1849 137/500 [=======>......................] 
- ETA: 2:00 - loss: 1.2979 - regression_loss: 1.1124 - classification_loss: 0.1855 138/500 [=======>......................] - ETA: 2:00 - loss: 1.2924 - regression_loss: 1.1075 - classification_loss: 0.1848 139/500 [=======>......................] - ETA: 2:00 - loss: 1.2956 - regression_loss: 1.1105 - classification_loss: 0.1851 140/500 [=======>......................] - ETA: 1:59 - loss: 1.2981 - regression_loss: 1.1125 - classification_loss: 0.1856 141/500 [=======>......................] - ETA: 1:59 - loss: 1.3002 - regression_loss: 1.1140 - classification_loss: 0.1862 142/500 [=======>......................] - ETA: 1:59 - loss: 1.3076 - regression_loss: 1.1204 - classification_loss: 0.1871 143/500 [=======>......................] - ETA: 1:58 - loss: 1.3128 - regression_loss: 1.1249 - classification_loss: 0.1880 144/500 [=======>......................] - ETA: 1:58 - loss: 1.3157 - regression_loss: 1.1273 - classification_loss: 0.1884 145/500 [=======>......................] - ETA: 1:57 - loss: 1.3197 - regression_loss: 1.1305 - classification_loss: 0.1891 146/500 [=======>......................] - ETA: 1:57 - loss: 1.3210 - regression_loss: 1.1317 - classification_loss: 0.1893 147/500 [=======>......................] - ETA: 1:57 - loss: 1.3239 - regression_loss: 1.1341 - classification_loss: 0.1898 148/500 [=======>......................] - ETA: 1:56 - loss: 1.3252 - regression_loss: 1.1353 - classification_loss: 0.1899 149/500 [=======>......................] - ETA: 1:56 - loss: 1.3252 - regression_loss: 1.1357 - classification_loss: 0.1895 150/500 [========>.....................] - ETA: 1:56 - loss: 1.3285 - regression_loss: 1.1387 - classification_loss: 0.1898 151/500 [========>.....................] - ETA: 1:55 - loss: 1.3281 - regression_loss: 1.1382 - classification_loss: 0.1899 152/500 [========>.....................] - ETA: 1:55 - loss: 1.3243 - regression_loss: 1.1346 - classification_loss: 0.1897 153/500 [========>.....................] 
- ETA: 1:55 - loss: 1.3299 - regression_loss: 1.1390 - classification_loss: 0.1909 154/500 [========>.....................] - ETA: 1:54 - loss: 1.3338 - regression_loss: 1.1426 - classification_loss: 0.1912 155/500 [========>.....................] - ETA: 1:54 - loss: 1.3320 - regression_loss: 1.1412 - classification_loss: 0.1907 156/500 [========>.....................] - ETA: 1:54 - loss: 1.3359 - regression_loss: 1.1443 - classification_loss: 0.1916 157/500 [========>.....................] - ETA: 1:53 - loss: 1.3378 - regression_loss: 1.1458 - classification_loss: 0.1920 158/500 [========>.....................] - ETA: 1:53 - loss: 1.3428 - regression_loss: 1.1496 - classification_loss: 0.1932 159/500 [========>.....................] - ETA: 1:53 - loss: 1.3426 - regression_loss: 1.1494 - classification_loss: 0.1932 160/500 [========>.....................] - ETA: 1:52 - loss: 1.3453 - regression_loss: 1.1517 - classification_loss: 0.1936 161/500 [========>.....................] - ETA: 1:52 - loss: 1.3474 - regression_loss: 1.1534 - classification_loss: 0.1940 162/500 [========>.....................] - ETA: 1:52 - loss: 1.3519 - regression_loss: 1.1566 - classification_loss: 0.1952 163/500 [========>.....................] - ETA: 1:51 - loss: 1.3588 - regression_loss: 1.1624 - classification_loss: 0.1964 164/500 [========>.....................] - ETA: 1:51 - loss: 1.3638 - regression_loss: 1.1663 - classification_loss: 0.1975 165/500 [========>.....................] - ETA: 1:51 - loss: 1.3676 - regression_loss: 1.1692 - classification_loss: 0.1984 166/500 [========>.....................] - ETA: 1:50 - loss: 1.3712 - regression_loss: 1.1722 - classification_loss: 0.1990 167/500 [=========>....................] - ETA: 1:50 - loss: 1.3731 - regression_loss: 1.1739 - classification_loss: 0.1993 168/500 [=========>....................] - ETA: 1:50 - loss: 1.3691 - regression_loss: 1.1705 - classification_loss: 0.1986 169/500 [=========>....................] 
- ETA: 1:49 - loss: 1.3674 - regression_loss: 1.1691 - classification_loss: 0.1983 170/500 [=========>....................] - ETA: 1:49 - loss: 1.3690 - regression_loss: 1.1703 - classification_loss: 0.1987 171/500 [=========>....................] - ETA: 1:49 - loss: 1.3684 - regression_loss: 1.1698 - classification_loss: 0.1986 172/500 [=========>....................] - ETA: 1:48 - loss: 1.3690 - regression_loss: 1.1704 - classification_loss: 0.1987 173/500 [=========>....................] - ETA: 1:48 - loss: 1.3721 - regression_loss: 1.1732 - classification_loss: 0.1989 174/500 [=========>....................] - ETA: 1:48 - loss: 1.3685 - regression_loss: 1.1701 - classification_loss: 0.1984 175/500 [=========>....................] - ETA: 1:47 - loss: 1.3692 - regression_loss: 1.1706 - classification_loss: 0.1985 176/500 [=========>....................] - ETA: 1:47 - loss: 1.3704 - regression_loss: 1.1717 - classification_loss: 0.1987 177/500 [=========>....................] - ETA: 1:47 - loss: 1.3739 - regression_loss: 1.1749 - classification_loss: 0.1990 178/500 [=========>....................] - ETA: 1:46 - loss: 1.3739 - regression_loss: 1.1747 - classification_loss: 0.1992 179/500 [=========>....................] - ETA: 1:46 - loss: 1.3757 - regression_loss: 1.1761 - classification_loss: 0.1996 180/500 [=========>....................] - ETA: 1:46 - loss: 1.3788 - regression_loss: 1.1786 - classification_loss: 0.2002 181/500 [=========>....................] - ETA: 1:45 - loss: 1.3769 - regression_loss: 1.1768 - classification_loss: 0.2001 182/500 [=========>....................] - ETA: 1:45 - loss: 1.3735 - regression_loss: 1.1735 - classification_loss: 0.2001 183/500 [=========>....................] - ETA: 1:45 - loss: 1.3707 - regression_loss: 1.1708 - classification_loss: 0.1999 184/500 [==========>...................] - ETA: 1:44 - loss: 1.3737 - regression_loss: 1.1733 - classification_loss: 0.2004 185/500 [==========>...................] 
- ETA: 1:44 - loss: 1.3738 - regression_loss: 1.1732 - classification_loss: 0.2005 186/500 [==========>...................] - ETA: 1:44 - loss: 1.3750 - regression_loss: 1.1743 - classification_loss: 0.2007 187/500 [==========>...................] - ETA: 1:43 - loss: 1.3744 - regression_loss: 1.1739 - classification_loss: 0.2005 188/500 [==========>...................] - ETA: 1:43 - loss: 1.3758 - regression_loss: 1.1751 - classification_loss: 0.2007 189/500 [==========>...................] - ETA: 1:43 - loss: 1.3765 - regression_loss: 1.1757 - classification_loss: 0.2008 190/500 [==========>...................] - ETA: 1:42 - loss: 1.3779 - regression_loss: 1.1771 - classification_loss: 0.2008 191/500 [==========>...................] - ETA: 1:42 - loss: 1.3778 - regression_loss: 1.1769 - classification_loss: 0.2009 192/500 [==========>...................] - ETA: 1:42 - loss: 1.3777 - regression_loss: 1.1768 - classification_loss: 0.2009 193/500 [==========>...................] - ETA: 1:41 - loss: 1.3809 - regression_loss: 1.1792 - classification_loss: 0.2017 194/500 [==========>...................] - ETA: 1:41 - loss: 1.3808 - regression_loss: 1.1792 - classification_loss: 0.2016 195/500 [==========>...................] - ETA: 1:41 - loss: 1.3798 - regression_loss: 1.1783 - classification_loss: 0.2015 196/500 [==========>...................] - ETA: 1:40 - loss: 1.3811 - regression_loss: 1.1793 - classification_loss: 0.2018 197/500 [==========>...................] - ETA: 1:40 - loss: 1.3817 - regression_loss: 1.1795 - classification_loss: 0.2022 198/500 [==========>...................] - ETA: 1:40 - loss: 1.3838 - regression_loss: 1.1809 - classification_loss: 0.2029 199/500 [==========>...................] - ETA: 1:39 - loss: 1.3803 - regression_loss: 1.1780 - classification_loss: 0.2023 200/500 [===========>..................] - ETA: 1:39 - loss: 1.3775 - regression_loss: 1.1757 - classification_loss: 0.2018 201/500 [===========>..................] 
[... per-step progress-bar redraws for steps 202-489 of epoch 20 elided; loss fluctuated between ~1.35 and ~1.38 (regression_loss ~1.15-1.18, classification_loss ~0.197-0.202), ETA counting down from 1:39 to 0:03 ...]
500/500 [==============================] - 166s 331ms/step - loss: 1.3536 - regression_loss: 1.1563 - classification_loss: 0.1973
1172 instances of class plum with average precision: 0.6169
mAP: 0.6169
Epoch 00020: saving model to ./training/snapshots/resnet101_pascal_20.h5
Epoch 21/150
- ETA: 2:46 - loss: 1.3581 - regression_loss: 1.1714 - classification_loss: 0.1867 5/500 [..............................] - ETA: 2:48 - loss: 1.2369 - regression_loss: 1.0536 - classification_loss: 0.1833 6/500 [..............................] - ETA: 2:46 - loss: 1.2977 - regression_loss: 1.1030 - classification_loss: 0.1947 7/500 [..............................] - ETA: 2:44 - loss: 1.2572 - regression_loss: 1.0685 - classification_loss: 0.1888 8/500 [..............................] - ETA: 2:43 - loss: 1.2524 - regression_loss: 1.0633 - classification_loss: 0.1891 9/500 [..............................] - ETA: 2:43 - loss: 1.2309 - regression_loss: 1.0451 - classification_loss: 0.1857 10/500 [..............................] - ETA: 2:43 - loss: 1.3358 - regression_loss: 1.1343 - classification_loss: 0.2016 11/500 [..............................] - ETA: 2:42 - loss: 1.2973 - regression_loss: 1.1022 - classification_loss: 0.1951 12/500 [..............................] - ETA: 2:43 - loss: 1.2879 - regression_loss: 1.0996 - classification_loss: 0.1883 13/500 [..............................] - ETA: 2:42 - loss: 1.2771 - regression_loss: 1.0907 - classification_loss: 0.1864 14/500 [..............................] - ETA: 2:43 - loss: 1.2693 - regression_loss: 1.0824 - classification_loss: 0.1868 15/500 [..............................] - ETA: 2:42 - loss: 1.2907 - regression_loss: 1.1020 - classification_loss: 0.1886 16/500 [..............................] - ETA: 2:42 - loss: 1.3167 - regression_loss: 1.1229 - classification_loss: 0.1939 17/500 [>.............................] - ETA: 2:41 - loss: 1.3298 - regression_loss: 1.1356 - classification_loss: 0.1942 18/500 [>.............................] - ETA: 2:40 - loss: 1.3399 - regression_loss: 1.1448 - classification_loss: 0.1951 19/500 [>.............................] - ETA: 2:39 - loss: 1.3182 - regression_loss: 1.1245 - classification_loss: 0.1937 20/500 [>.............................] 
- ETA: 2:39 - loss: 1.3042 - regression_loss: 1.1137 - classification_loss: 0.1905 21/500 [>.............................] - ETA: 2:38 - loss: 1.2936 - regression_loss: 1.1051 - classification_loss: 0.1885 22/500 [>.............................] - ETA: 2:38 - loss: 1.3035 - regression_loss: 1.1144 - classification_loss: 0.1892 23/500 [>.............................] - ETA: 2:38 - loss: 1.3278 - regression_loss: 1.1335 - classification_loss: 0.1943 24/500 [>.............................] - ETA: 2:37 - loss: 1.3434 - regression_loss: 1.1472 - classification_loss: 0.1963 25/500 [>.............................] - ETA: 2:37 - loss: 1.3307 - regression_loss: 1.1358 - classification_loss: 0.1948 26/500 [>.............................] - ETA: 2:37 - loss: 1.3307 - regression_loss: 1.1343 - classification_loss: 0.1964 27/500 [>.............................] - ETA: 2:36 - loss: 1.3147 - regression_loss: 1.1210 - classification_loss: 0.1937 28/500 [>.............................] - ETA: 2:36 - loss: 1.3106 - regression_loss: 1.1174 - classification_loss: 0.1932 29/500 [>.............................] - ETA: 2:35 - loss: 1.3033 - regression_loss: 1.1130 - classification_loss: 0.1903 30/500 [>.............................] - ETA: 2:35 - loss: 1.2887 - regression_loss: 1.1005 - classification_loss: 0.1882 31/500 [>.............................] - ETA: 2:34 - loss: 1.2932 - regression_loss: 1.1024 - classification_loss: 0.1907 32/500 [>.............................] - ETA: 2:34 - loss: 1.2786 - regression_loss: 1.0904 - classification_loss: 0.1882 33/500 [>.............................] - ETA: 2:33 - loss: 1.2886 - regression_loss: 1.0975 - classification_loss: 0.1911 34/500 [=>............................] - ETA: 2:33 - loss: 1.3057 - regression_loss: 1.1108 - classification_loss: 0.1949 35/500 [=>............................] - ETA: 2:33 - loss: 1.3014 - regression_loss: 1.1064 - classification_loss: 0.1950 36/500 [=>............................] 
- ETA: 2:33 - loss: 1.3126 - regression_loss: 1.1160 - classification_loss: 0.1967 37/500 [=>............................] - ETA: 2:32 - loss: 1.3016 - regression_loss: 1.1066 - classification_loss: 0.1950 38/500 [=>............................] - ETA: 2:32 - loss: 1.2852 - regression_loss: 1.0917 - classification_loss: 0.1935 39/500 [=>............................] - ETA: 2:31 - loss: 1.2992 - regression_loss: 1.1047 - classification_loss: 0.1945 40/500 [=>............................] - ETA: 2:31 - loss: 1.3052 - regression_loss: 1.1095 - classification_loss: 0.1957 41/500 [=>............................] - ETA: 2:31 - loss: 1.3175 - regression_loss: 1.1193 - classification_loss: 0.1982 42/500 [=>............................] - ETA: 2:30 - loss: 1.3028 - regression_loss: 1.1070 - classification_loss: 0.1958 43/500 [=>............................] - ETA: 2:30 - loss: 1.2888 - regression_loss: 1.0945 - classification_loss: 0.1944 44/500 [=>............................] - ETA: 2:30 - loss: 1.2945 - regression_loss: 1.0986 - classification_loss: 0.1960 45/500 [=>............................] - ETA: 2:30 - loss: 1.2783 - regression_loss: 1.0842 - classification_loss: 0.1940 46/500 [=>............................] - ETA: 2:29 - loss: 1.2990 - regression_loss: 1.1015 - classification_loss: 0.1976 47/500 [=>............................] - ETA: 2:29 - loss: 1.2804 - regression_loss: 1.0855 - classification_loss: 0.1948 48/500 [=>............................] - ETA: 2:29 - loss: 1.2766 - regression_loss: 1.0815 - classification_loss: 0.1950 49/500 [=>............................] - ETA: 2:28 - loss: 1.2750 - regression_loss: 1.0814 - classification_loss: 0.1935 50/500 [==>...........................] - ETA: 2:28 - loss: 1.2900 - regression_loss: 1.0912 - classification_loss: 0.1989 51/500 [==>...........................] - ETA: 2:28 - loss: 1.2734 - regression_loss: 1.0772 - classification_loss: 0.1962 52/500 [==>...........................] 
- ETA: 2:27 - loss: 1.2799 - regression_loss: 1.0829 - classification_loss: 0.1969 53/500 [==>...........................] - ETA: 2:27 - loss: 1.2831 - regression_loss: 1.0859 - classification_loss: 0.1972 54/500 [==>...........................] - ETA: 2:27 - loss: 1.2916 - regression_loss: 1.0917 - classification_loss: 0.1999 55/500 [==>...........................] - ETA: 2:26 - loss: 1.2851 - regression_loss: 1.0869 - classification_loss: 0.1982 56/500 [==>...........................] - ETA: 2:26 - loss: 1.2754 - regression_loss: 1.0794 - classification_loss: 0.1960 57/500 [==>...........................] - ETA: 2:26 - loss: 1.2840 - regression_loss: 1.0872 - classification_loss: 0.1968 58/500 [==>...........................] - ETA: 2:25 - loss: 1.2702 - regression_loss: 1.0750 - classification_loss: 0.1952 59/500 [==>...........................] - ETA: 2:25 - loss: 1.2667 - regression_loss: 1.0721 - classification_loss: 0.1946 60/500 [==>...........................] - ETA: 2:25 - loss: 1.2727 - regression_loss: 1.0775 - classification_loss: 0.1951 61/500 [==>...........................] - ETA: 2:24 - loss: 1.2797 - regression_loss: 1.0842 - classification_loss: 0.1955 62/500 [==>...........................] - ETA: 2:24 - loss: 1.2838 - regression_loss: 1.0882 - classification_loss: 0.1956 63/500 [==>...........................] - ETA: 2:24 - loss: 1.2768 - regression_loss: 1.0825 - classification_loss: 0.1942 64/500 [==>...........................] - ETA: 2:24 - loss: 1.2674 - regression_loss: 1.0742 - classification_loss: 0.1933 65/500 [==>...........................] - ETA: 2:24 - loss: 1.2731 - regression_loss: 1.0790 - classification_loss: 0.1941 66/500 [==>...........................] - ETA: 2:23 - loss: 1.2699 - regression_loss: 1.0763 - classification_loss: 0.1936 67/500 [===>..........................] - ETA: 2:23 - loss: 1.2586 - regression_loss: 1.0665 - classification_loss: 0.1921 68/500 [===>..........................] 
- ETA: 2:22 - loss: 1.2585 - regression_loss: 1.0666 - classification_loss: 0.1919 69/500 [===>..........................] - ETA: 2:22 - loss: 1.2595 - regression_loss: 1.0687 - classification_loss: 0.1908 70/500 [===>..........................] - ETA: 2:22 - loss: 1.2572 - regression_loss: 1.0668 - classification_loss: 0.1904 71/500 [===>..........................] - ETA: 2:21 - loss: 1.2557 - regression_loss: 1.0655 - classification_loss: 0.1901 72/500 [===>..........................] - ETA: 2:21 - loss: 1.2602 - regression_loss: 1.0697 - classification_loss: 0.1905 73/500 [===>..........................] - ETA: 2:20 - loss: 1.2654 - regression_loss: 1.0739 - classification_loss: 0.1916 74/500 [===>..........................] - ETA: 2:20 - loss: 1.2670 - regression_loss: 1.0756 - classification_loss: 0.1914 75/500 [===>..........................] - ETA: 2:20 - loss: 1.2730 - regression_loss: 1.0807 - classification_loss: 0.1923 76/500 [===>..........................] - ETA: 2:19 - loss: 1.2744 - regression_loss: 1.0816 - classification_loss: 0.1929 77/500 [===>..........................] - ETA: 2:19 - loss: 1.2752 - regression_loss: 1.0820 - classification_loss: 0.1931 78/500 [===>..........................] - ETA: 2:19 - loss: 1.2800 - regression_loss: 1.0865 - classification_loss: 0.1934 79/500 [===>..........................] - ETA: 2:18 - loss: 1.2859 - regression_loss: 1.0919 - classification_loss: 0.1940 80/500 [===>..........................] - ETA: 2:18 - loss: 1.2836 - regression_loss: 1.0901 - classification_loss: 0.1935 81/500 [===>..........................] - ETA: 2:18 - loss: 1.2874 - regression_loss: 1.0938 - classification_loss: 0.1935 82/500 [===>..........................] - ETA: 2:17 - loss: 1.2822 - regression_loss: 1.0894 - classification_loss: 0.1927 83/500 [===>..........................] - ETA: 2:17 - loss: 1.2753 - regression_loss: 1.0841 - classification_loss: 0.1912 84/500 [====>.........................] 
- ETA: 2:17 - loss: 1.2810 - regression_loss: 1.0885 - classification_loss: 0.1924 85/500 [====>.........................] - ETA: 2:16 - loss: 1.2781 - regression_loss: 1.0863 - classification_loss: 0.1918 86/500 [====>.........................] - ETA: 2:16 - loss: 1.2781 - regression_loss: 1.0856 - classification_loss: 0.1925 87/500 [====>.........................] - ETA: 2:16 - loss: 1.2705 - regression_loss: 1.0793 - classification_loss: 0.1911 88/500 [====>.........................] - ETA: 2:15 - loss: 1.2677 - regression_loss: 1.0763 - classification_loss: 0.1915 89/500 [====>.........................] - ETA: 2:15 - loss: 1.2725 - regression_loss: 1.0805 - classification_loss: 0.1920 90/500 [====>.........................] - ETA: 2:15 - loss: 1.2732 - regression_loss: 1.0813 - classification_loss: 0.1920 91/500 [====>.........................] - ETA: 2:14 - loss: 1.2738 - regression_loss: 1.0825 - classification_loss: 0.1913 92/500 [====>.........................] - ETA: 2:14 - loss: 1.2760 - regression_loss: 1.0848 - classification_loss: 0.1912 93/500 [====>.........................] - ETA: 2:14 - loss: 1.2737 - regression_loss: 1.0831 - classification_loss: 0.1906 94/500 [====>.........................] - ETA: 2:13 - loss: 1.2790 - regression_loss: 1.0874 - classification_loss: 0.1916 95/500 [====>.........................] - ETA: 2:13 - loss: 1.2812 - regression_loss: 1.0895 - classification_loss: 0.1917 96/500 [====>.........................] - ETA: 2:12 - loss: 1.2771 - regression_loss: 1.0862 - classification_loss: 0.1909 97/500 [====>.........................] - ETA: 2:12 - loss: 1.2768 - regression_loss: 1.0866 - classification_loss: 0.1902 98/500 [====>.........................] - ETA: 2:12 - loss: 1.2803 - regression_loss: 1.0899 - classification_loss: 0.1904 99/500 [====>.........................] - ETA: 2:11 - loss: 1.2726 - regression_loss: 1.0836 - classification_loss: 0.1891 100/500 [=====>........................] 
- ETA: 2:11 - loss: 1.2682 - regression_loss: 1.0793 - classification_loss: 0.1888 101/500 [=====>........................] - ETA: 2:11 - loss: 1.2720 - regression_loss: 1.0821 - classification_loss: 0.1898 102/500 [=====>........................] - ETA: 2:11 - loss: 1.2703 - regression_loss: 1.0808 - classification_loss: 0.1895 103/500 [=====>........................] - ETA: 2:10 - loss: 1.2642 - regression_loss: 1.0755 - classification_loss: 0.1887 104/500 [=====>........................] - ETA: 2:10 - loss: 1.2579 - regression_loss: 1.0702 - classification_loss: 0.1877 105/500 [=====>........................] - ETA: 2:10 - loss: 1.2642 - regression_loss: 1.0756 - classification_loss: 0.1885 106/500 [=====>........................] - ETA: 2:10 - loss: 1.2618 - regression_loss: 1.0733 - classification_loss: 0.1884 107/500 [=====>........................] - ETA: 2:09 - loss: 1.2607 - regression_loss: 1.0725 - classification_loss: 0.1882 108/500 [=====>........................] - ETA: 2:09 - loss: 1.2611 - regression_loss: 1.0729 - classification_loss: 0.1882 109/500 [=====>........................] - ETA: 2:08 - loss: 1.2602 - regression_loss: 1.0721 - classification_loss: 0.1881 110/500 [=====>........................] - ETA: 2:08 - loss: 1.2624 - regression_loss: 1.0741 - classification_loss: 0.1883 111/500 [=====>........................] - ETA: 2:08 - loss: 1.2629 - regression_loss: 1.0749 - classification_loss: 0.1879 112/500 [=====>........................] - ETA: 2:07 - loss: 1.2589 - regression_loss: 1.0713 - classification_loss: 0.1875 113/500 [=====>........................] - ETA: 2:07 - loss: 1.2636 - regression_loss: 1.0759 - classification_loss: 0.1877 114/500 [=====>........................] - ETA: 2:07 - loss: 1.2662 - regression_loss: 1.0784 - classification_loss: 0.1879 115/500 [=====>........................] - ETA: 2:06 - loss: 1.2646 - regression_loss: 1.0768 - classification_loss: 0.1877 116/500 [=====>........................] 
- ETA: 2:06 - loss: 1.2669 - regression_loss: 1.0792 - classification_loss: 0.1877 117/500 [======>.......................] - ETA: 2:06 - loss: 1.2697 - regression_loss: 1.0816 - classification_loss: 0.1881 118/500 [======>.......................] - ETA: 2:05 - loss: 1.2739 - regression_loss: 1.0850 - classification_loss: 0.1889 119/500 [======>.......................] - ETA: 2:05 - loss: 1.2786 - regression_loss: 1.0891 - classification_loss: 0.1896 120/500 [======>.......................] - ETA: 2:05 - loss: 1.2780 - regression_loss: 1.0886 - classification_loss: 0.1894 121/500 [======>.......................] - ETA: 2:04 - loss: 1.2812 - regression_loss: 1.0914 - classification_loss: 0.1898 122/500 [======>.......................] - ETA: 2:04 - loss: 1.2792 - regression_loss: 1.0896 - classification_loss: 0.1896 123/500 [======>.......................] - ETA: 2:04 - loss: 1.2763 - regression_loss: 1.0874 - classification_loss: 0.1889 124/500 [======>.......................] - ETA: 2:03 - loss: 1.2735 - regression_loss: 1.0852 - classification_loss: 0.1883 125/500 [======>.......................] - ETA: 2:03 - loss: 1.2748 - regression_loss: 1.0860 - classification_loss: 0.1888 126/500 [======>.......................] - ETA: 2:03 - loss: 1.2744 - regression_loss: 1.0859 - classification_loss: 0.1885 127/500 [======>.......................] - ETA: 2:02 - loss: 1.2772 - regression_loss: 1.0881 - classification_loss: 0.1891 128/500 [======>.......................] - ETA: 2:02 - loss: 1.2745 - regression_loss: 1.0862 - classification_loss: 0.1883 129/500 [======>.......................] - ETA: 2:02 - loss: 1.2755 - regression_loss: 1.0872 - classification_loss: 0.1883 130/500 [======>.......................] - ETA: 2:01 - loss: 1.2751 - regression_loss: 1.0872 - classification_loss: 0.1880 131/500 [======>.......................] - ETA: 2:01 - loss: 1.2695 - regression_loss: 1.0824 - classification_loss: 0.1871 132/500 [======>.......................] 
- ETA: 2:01 - loss: 1.2747 - regression_loss: 1.0871 - classification_loss: 0.1877 133/500 [======>.......................] - ETA: 2:00 - loss: 1.2732 - regression_loss: 1.0859 - classification_loss: 0.1873 134/500 [=======>......................] - ETA: 2:00 - loss: 1.2694 - regression_loss: 1.0830 - classification_loss: 0.1864 135/500 [=======>......................] - ETA: 2:00 - loss: 1.2648 - regression_loss: 1.0792 - classification_loss: 0.1856 136/500 [=======>......................] - ETA: 2:00 - loss: 1.2663 - regression_loss: 1.0805 - classification_loss: 0.1857 137/500 [=======>......................] - ETA: 1:59 - loss: 1.2694 - regression_loss: 1.0834 - classification_loss: 0.1861 138/500 [=======>......................] - ETA: 1:59 - loss: 1.2658 - regression_loss: 1.0802 - classification_loss: 0.1855 139/500 [=======>......................] - ETA: 1:59 - loss: 1.2626 - regression_loss: 1.0777 - classification_loss: 0.1849 140/500 [=======>......................] - ETA: 1:58 - loss: 1.2623 - regression_loss: 1.0775 - classification_loss: 0.1849 141/500 [=======>......................] - ETA: 1:58 - loss: 1.2655 - regression_loss: 1.0801 - classification_loss: 0.1854 142/500 [=======>......................] - ETA: 1:58 - loss: 1.2678 - regression_loss: 1.0823 - classification_loss: 0.1855 143/500 [=======>......................] - ETA: 1:57 - loss: 1.2660 - regression_loss: 1.0809 - classification_loss: 0.1852 144/500 [=======>......................] - ETA: 1:57 - loss: 1.2683 - regression_loss: 1.0827 - classification_loss: 0.1856 145/500 [=======>......................] - ETA: 1:57 - loss: 1.2690 - regression_loss: 1.0835 - classification_loss: 0.1855 146/500 [=======>......................] - ETA: 1:56 - loss: 1.2745 - regression_loss: 1.0878 - classification_loss: 0.1866 147/500 [=======>......................] - ETA: 1:56 - loss: 1.2705 - regression_loss: 1.0843 - classification_loss: 0.1862 148/500 [=======>......................] 
- ETA: 1:56 - loss: 1.2723 - regression_loss: 1.0858 - classification_loss: 0.1865 149/500 [=======>......................] - ETA: 1:55 - loss: 1.2717 - regression_loss: 1.0854 - classification_loss: 0.1863 150/500 [========>.....................] - ETA: 1:55 - loss: 1.2705 - regression_loss: 1.0843 - classification_loss: 0.1862 151/500 [========>.....................] - ETA: 1:55 - loss: 1.2712 - regression_loss: 1.0851 - classification_loss: 0.1861 152/500 [========>.....................] - ETA: 1:54 - loss: 1.2738 - regression_loss: 1.0874 - classification_loss: 0.1863 153/500 [========>.....................] - ETA: 1:54 - loss: 1.2738 - regression_loss: 1.0875 - classification_loss: 0.1863 154/500 [========>.....................] - ETA: 1:54 - loss: 1.2781 - regression_loss: 1.0911 - classification_loss: 0.1870 155/500 [========>.....................] - ETA: 1:53 - loss: 1.2795 - regression_loss: 1.0927 - classification_loss: 0.1868 156/500 [========>.....................] - ETA: 1:53 - loss: 1.2797 - regression_loss: 1.0928 - classification_loss: 0.1869 157/500 [========>.....................] - ETA: 1:53 - loss: 1.2807 - regression_loss: 1.0935 - classification_loss: 0.1872 158/500 [========>.....................] - ETA: 1:52 - loss: 1.2825 - regression_loss: 1.0950 - classification_loss: 0.1874 159/500 [========>.....................] - ETA: 1:52 - loss: 1.2795 - regression_loss: 1.0926 - classification_loss: 0.1869 160/500 [========>.....................] - ETA: 1:52 - loss: 1.2822 - regression_loss: 1.0947 - classification_loss: 0.1875 161/500 [========>.....................] - ETA: 1:51 - loss: 1.2784 - regression_loss: 1.0915 - classification_loss: 0.1869 162/500 [========>.....................] - ETA: 1:51 - loss: 1.2750 - regression_loss: 1.0888 - classification_loss: 0.1863 163/500 [========>.....................] - ETA: 1:51 - loss: 1.2766 - regression_loss: 1.0902 - classification_loss: 0.1863 164/500 [========>.....................] 
- ETA: 1:50 - loss: 1.2725 - regression_loss: 1.0866 - classification_loss: 0.1860 165/500 [========>.....................] - ETA: 1:50 - loss: 1.2730 - regression_loss: 1.0865 - classification_loss: 0.1865 166/500 [========>.....................] - ETA: 1:50 - loss: 1.2773 - regression_loss: 1.0897 - classification_loss: 0.1875 167/500 [=========>....................] - ETA: 1:49 - loss: 1.2758 - regression_loss: 1.0880 - classification_loss: 0.1878 168/500 [=========>....................] - ETA: 1:49 - loss: 1.2806 - regression_loss: 1.0920 - classification_loss: 0.1885 169/500 [=========>....................] - ETA: 1:49 - loss: 1.2819 - regression_loss: 1.0933 - classification_loss: 0.1886 170/500 [=========>....................] - ETA: 1:49 - loss: 1.2824 - regression_loss: 1.0934 - classification_loss: 0.1890 171/500 [=========>....................] - ETA: 1:48 - loss: 1.2831 - regression_loss: 1.0942 - classification_loss: 0.1888 172/500 [=========>....................] - ETA: 1:48 - loss: 1.2843 - regression_loss: 1.0955 - classification_loss: 0.1888 173/500 [=========>....................] - ETA: 1:48 - loss: 1.2840 - regression_loss: 1.0952 - classification_loss: 0.1889 174/500 [=========>....................] - ETA: 1:47 - loss: 1.2864 - regression_loss: 1.0973 - classification_loss: 0.1891 175/500 [=========>....................] - ETA: 1:47 - loss: 1.2867 - regression_loss: 1.0974 - classification_loss: 0.1892 176/500 [=========>....................] - ETA: 1:47 - loss: 1.2869 - regression_loss: 1.0976 - classification_loss: 0.1893 177/500 [=========>....................] - ETA: 1:46 - loss: 1.2889 - regression_loss: 1.0994 - classification_loss: 0.1895 178/500 [=========>....................] - ETA: 1:46 - loss: 1.2926 - regression_loss: 1.1026 - classification_loss: 0.1901 179/500 [=========>....................] - ETA: 1:46 - loss: 1.2929 - regression_loss: 1.1028 - classification_loss: 0.1900 180/500 [=========>....................] 
- ETA: 1:45 - loss: 1.2901 - regression_loss: 1.1002 - classification_loss: 0.1899 181/500 [=========>....................] - ETA: 1:45 - loss: 1.2910 - regression_loss: 1.1010 - classification_loss: 0.1899 182/500 [=========>....................] - ETA: 1:45 - loss: 1.2939 - regression_loss: 1.1035 - classification_loss: 0.1904 183/500 [=========>....................] - ETA: 1:44 - loss: 1.2958 - regression_loss: 1.1051 - classification_loss: 0.1908 184/500 [==========>...................] - ETA: 1:44 - loss: 1.2978 - regression_loss: 1.1068 - classification_loss: 0.1910 185/500 [==========>...................] - ETA: 1:44 - loss: 1.2993 - regression_loss: 1.1081 - classification_loss: 0.1911 186/500 [==========>...................] - ETA: 1:43 - loss: 1.2976 - regression_loss: 1.1067 - classification_loss: 0.1908 187/500 [==========>...................] - ETA: 1:43 - loss: 1.2956 - regression_loss: 1.1051 - classification_loss: 0.1905 188/500 [==========>...................] - ETA: 1:43 - loss: 1.2936 - regression_loss: 1.1035 - classification_loss: 0.1901 189/500 [==========>...................] - ETA: 1:42 - loss: 1.2893 - regression_loss: 1.0996 - classification_loss: 0.1897 190/500 [==========>...................] - ETA: 1:42 - loss: 1.2873 - regression_loss: 1.0980 - classification_loss: 0.1893 191/500 [==========>...................] - ETA: 1:42 - loss: 1.2863 - regression_loss: 1.0972 - classification_loss: 0.1891 192/500 [==========>...................] - ETA: 1:41 - loss: 1.2904 - regression_loss: 1.1006 - classification_loss: 0.1898 193/500 [==========>...................] - ETA: 1:41 - loss: 1.2911 - regression_loss: 1.1014 - classification_loss: 0.1898 194/500 [==========>...................] - ETA: 1:41 - loss: 1.2885 - regression_loss: 1.0991 - classification_loss: 0.1894 195/500 [==========>...................] - ETA: 1:40 - loss: 1.2853 - regression_loss: 1.0961 - classification_loss: 0.1892 196/500 [==========>...................] 
- ETA: 1:40 - loss: 1.2839 - regression_loss: 1.0948 - classification_loss: 0.1891 197/500 [==========>...................] - ETA: 1:40 - loss: 1.2832 - regression_loss: 1.0942 - classification_loss: 0.1891 198/500 [==========>...................] - ETA: 1:39 - loss: 1.2874 - regression_loss: 1.0980 - classification_loss: 0.1895 199/500 [==========>...................] - ETA: 1:39 - loss: 1.2919 - regression_loss: 1.1016 - classification_loss: 0.1902 200/500 [===========>..................] - ETA: 1:39 - loss: 1.2892 - regression_loss: 1.0993 - classification_loss: 0.1900 201/500 [===========>..................] - ETA: 1:38 - loss: 1.2848 - regression_loss: 1.0953 - classification_loss: 0.1895 202/500 [===========>..................] - ETA: 1:38 - loss: 1.2876 - regression_loss: 1.0976 - classification_loss: 0.1900 203/500 [===========>..................] - ETA: 1:38 - loss: 1.2863 - regression_loss: 1.0964 - classification_loss: 0.1899 204/500 [===========>..................] - ETA: 1:37 - loss: 1.2884 - regression_loss: 1.0981 - classification_loss: 0.1903 205/500 [===========>..................] - ETA: 1:37 - loss: 1.2883 - regression_loss: 1.0980 - classification_loss: 0.1903 206/500 [===========>..................] - ETA: 1:37 - loss: 1.2886 - regression_loss: 1.0983 - classification_loss: 0.1903 207/500 [===========>..................] - ETA: 1:36 - loss: 1.2894 - regression_loss: 1.0990 - classification_loss: 0.1904 208/500 [===========>..................] - ETA: 1:36 - loss: 1.2871 - regression_loss: 1.0969 - classification_loss: 0.1901 209/500 [===========>..................] - ETA: 1:36 - loss: 1.2896 - regression_loss: 1.0991 - classification_loss: 0.1904 210/500 [===========>..................] - ETA: 1:35 - loss: 1.2910 - regression_loss: 1.1005 - classification_loss: 0.1905 211/500 [===========>..................] - ETA: 1:35 - loss: 1.2901 - regression_loss: 1.0997 - classification_loss: 0.1904 212/500 [===========>..................] 
- ETA: 1:35 - loss: 1.2916 - regression_loss: 1.1009 - classification_loss: 0.1906 213/500 [===========>..................] - ETA: 1:34 - loss: 1.2896 - regression_loss: 1.0992 - classification_loss: 0.1904 214/500 [===========>..................] - ETA: 1:34 - loss: 1.2864 - regression_loss: 1.0965 - classification_loss: 0.1899 215/500 [===========>..................] - ETA: 1:34 - loss: 1.2873 - regression_loss: 1.0974 - classification_loss: 0.1900 216/500 [===========>..................] - ETA: 1:33 - loss: 1.2867 - regression_loss: 1.0967 - classification_loss: 0.1900 217/500 [============>.................] - ETA: 1:33 - loss: 1.2855 - regression_loss: 1.0956 - classification_loss: 0.1898 218/500 [============>.................] - ETA: 1:33 - loss: 1.2846 - regression_loss: 1.0945 - classification_loss: 0.1901 219/500 [============>.................] - ETA: 1:32 - loss: 1.2868 - regression_loss: 1.0966 - classification_loss: 0.1902 220/500 [============>.................] - ETA: 1:32 - loss: 1.2874 - regression_loss: 1.0973 - classification_loss: 0.1901 221/500 [============>.................] - ETA: 1:32 - loss: 1.2891 - regression_loss: 1.0989 - classification_loss: 0.1902 222/500 [============>.................] - ETA: 1:31 - loss: 1.2862 - regression_loss: 1.0966 - classification_loss: 0.1897 223/500 [============>.................] - ETA: 1:31 - loss: 1.2878 - regression_loss: 1.0982 - classification_loss: 0.1896 224/500 [============>.................] - ETA: 1:31 - loss: 1.2861 - regression_loss: 1.0964 - classification_loss: 0.1897 225/500 [============>.................] - ETA: 1:30 - loss: 1.2841 - regression_loss: 1.0948 - classification_loss: 0.1893 226/500 [============>.................] - ETA: 1:30 - loss: 1.2820 - regression_loss: 1.0931 - classification_loss: 0.1889 227/500 [============>.................] - ETA: 1:30 - loss: 1.2793 - regression_loss: 1.0909 - classification_loss: 0.1883 228/500 [============>.................] 
- ETA: 1:29 - loss: 1.2772 - regression_loss: 1.0890 - classification_loss: 0.1882 229/500 [============>.................] - ETA: 1:29 - loss: 1.2784 - regression_loss: 1.0900 - classification_loss: 0.1884 230/500 [============>.................] - ETA: 1:29 - loss: 1.2782 - regression_loss: 1.0903 - classification_loss: 0.1879 231/500 [============>.................] - ETA: 1:28 - loss: 1.2783 - regression_loss: 1.0900 - classification_loss: 0.1883 232/500 [============>.................] - ETA: 1:28 - loss: 1.2817 - regression_loss: 1.0931 - classification_loss: 0.1886 233/500 [============>.................] - ETA: 1:28 - loss: 1.2795 - regression_loss: 1.0912 - classification_loss: 0.1883 234/500 [=============>................] - ETA: 1:27 - loss: 1.2760 - regression_loss: 1.0883 - classification_loss: 0.1877 235/500 [=============>................] - ETA: 1:27 - loss: 1.2736 - regression_loss: 1.0864 - classification_loss: 0.1872 236/500 [=============>................] - ETA: 1:27 - loss: 1.2748 - regression_loss: 1.0875 - classification_loss: 0.1873 237/500 [=============>................] - ETA: 1:26 - loss: 1.2727 - regression_loss: 1.0857 - classification_loss: 0.1870 238/500 [=============>................] - ETA: 1:26 - loss: 1.2720 - regression_loss: 1.0852 - classification_loss: 0.1868 239/500 [=============>................] - ETA: 1:26 - loss: 1.2747 - regression_loss: 1.0876 - classification_loss: 0.1872 240/500 [=============>................] - ETA: 1:25 - loss: 1.2731 - regression_loss: 1.0863 - classification_loss: 0.1868 241/500 [=============>................] - ETA: 1:25 - loss: 1.2712 - regression_loss: 1.0846 - classification_loss: 0.1865 242/500 [=============>................] - ETA: 1:25 - loss: 1.2710 - regression_loss: 1.0847 - classification_loss: 0.1863 243/500 [=============>................] - ETA: 1:24 - loss: 1.2709 - regression_loss: 1.0846 - classification_loss: 0.1863 244/500 [=============>................] 
- ETA: 1:24 - loss: 1.2709 - regression_loss: 1.0845 - classification_loss: 0.1864
245/500 [=============>................] - ETA: 1:24 - loss: 1.2678 - regression_loss: 1.0817 - classification_loss: 0.1862
[... per-step progress-bar redraw frames for steps 246-498 elided; loss holds in the 1.25-1.29 range, classification_loss in the 0.18-0.19 range ...]
499/500 [============================>.]
- ETA: 0s - loss: 1.2811 - regression_loss: 1.0899 - classification_loss: 0.1912
500/500 [==============================] - 165s 331ms/step - loss: 1.2807 - regression_loss: 1.0896 - classification_loss: 0.1911
1172 instances of class plum with average precision: 0.6124
mAP: 0.6124
Epoch 00021: saving model to ./training/snapshots/resnet101_pascal_21.h5
Epoch 22/150
1/500 [..............................] - ETA: 2:52 - loss: 1.2708 - regression_loss: 1.1214 - classification_loss: 0.1495
[... per-step progress-bar redraw frames for steps 2-77 elided; loss settles near 1.24 by step 77 ...]
78/500 [===>..........................]
- ETA: 2:18 - loss: 1.2377 - regression_loss: 1.0560 - classification_loss: 0.1817 79/500 [===>..........................] - ETA: 2:18 - loss: 1.2407 - regression_loss: 1.0587 - classification_loss: 0.1820 80/500 [===>..........................] - ETA: 2:18 - loss: 1.2498 - regression_loss: 1.0659 - classification_loss: 0.1839 81/500 [===>..........................] - ETA: 2:17 - loss: 1.2579 - regression_loss: 1.0730 - classification_loss: 0.1849 82/500 [===>..........................] - ETA: 2:17 - loss: 1.2577 - regression_loss: 1.0736 - classification_loss: 0.1841 83/500 [===>..........................] - ETA: 2:17 - loss: 1.2615 - regression_loss: 1.0768 - classification_loss: 0.1847 84/500 [====>.........................] - ETA: 2:16 - loss: 1.2590 - regression_loss: 1.0749 - classification_loss: 0.1841 85/500 [====>.........................] - ETA: 2:16 - loss: 1.2601 - regression_loss: 1.0762 - classification_loss: 0.1839 86/500 [====>.........................] - ETA: 2:16 - loss: 1.2539 - regression_loss: 1.0709 - classification_loss: 0.1831 87/500 [====>.........................] - ETA: 2:15 - loss: 1.2574 - regression_loss: 1.0740 - classification_loss: 0.1834 88/500 [====>.........................] - ETA: 2:15 - loss: 1.2527 - regression_loss: 1.0701 - classification_loss: 0.1826 89/500 [====>.........................] - ETA: 2:15 - loss: 1.2550 - regression_loss: 1.0722 - classification_loss: 0.1829 90/500 [====>.........................] - ETA: 2:14 - loss: 1.2534 - regression_loss: 1.0704 - classification_loss: 0.1829 91/500 [====>.........................] - ETA: 2:14 - loss: 1.2512 - regression_loss: 1.0688 - classification_loss: 0.1825 92/500 [====>.........................] - ETA: 2:14 - loss: 1.2501 - regression_loss: 1.0675 - classification_loss: 0.1826 93/500 [====>.........................] - ETA: 2:13 - loss: 1.2541 - regression_loss: 1.0711 - classification_loss: 0.1830 94/500 [====>.........................] 
- ETA: 2:13 - loss: 1.2599 - regression_loss: 1.0757 - classification_loss: 0.1842 95/500 [====>.........................] - ETA: 2:13 - loss: 1.2636 - regression_loss: 1.0791 - classification_loss: 0.1844 96/500 [====>.........................] - ETA: 2:13 - loss: 1.2661 - regression_loss: 1.0812 - classification_loss: 0.1849 97/500 [====>.........................] - ETA: 2:12 - loss: 1.2600 - regression_loss: 1.0763 - classification_loss: 0.1837 98/500 [====>.........................] - ETA: 2:12 - loss: 1.2644 - regression_loss: 1.0807 - classification_loss: 0.1837 99/500 [====>.........................] - ETA: 2:12 - loss: 1.2642 - regression_loss: 1.0806 - classification_loss: 0.1836 100/500 [=====>........................] - ETA: 2:11 - loss: 1.2671 - regression_loss: 1.0831 - classification_loss: 0.1840 101/500 [=====>........................] - ETA: 2:11 - loss: 1.2599 - regression_loss: 1.0768 - classification_loss: 0.1831 102/500 [=====>........................] - ETA: 2:11 - loss: 1.2596 - regression_loss: 1.0767 - classification_loss: 0.1829 103/500 [=====>........................] - ETA: 2:10 - loss: 1.2649 - regression_loss: 1.0808 - classification_loss: 0.1841 104/500 [=====>........................] - ETA: 2:10 - loss: 1.2606 - regression_loss: 1.0768 - classification_loss: 0.1838 105/500 [=====>........................] - ETA: 2:09 - loss: 1.2610 - regression_loss: 1.0772 - classification_loss: 0.1837 106/500 [=====>........................] - ETA: 2:09 - loss: 1.2624 - regression_loss: 1.0785 - classification_loss: 0.1839 107/500 [=====>........................] - ETA: 2:09 - loss: 1.2644 - regression_loss: 1.0804 - classification_loss: 0.1841 108/500 [=====>........................] - ETA: 2:08 - loss: 1.2631 - regression_loss: 1.0791 - classification_loss: 0.1841 109/500 [=====>........................] - ETA: 2:08 - loss: 1.2588 - regression_loss: 1.0755 - classification_loss: 0.1833 110/500 [=====>........................] 
- ETA: 2:08 - loss: 1.2612 - regression_loss: 1.0773 - classification_loss: 0.1838 111/500 [=====>........................] - ETA: 2:08 - loss: 1.2624 - regression_loss: 1.0786 - classification_loss: 0.1839 112/500 [=====>........................] - ETA: 2:07 - loss: 1.2620 - regression_loss: 1.0778 - classification_loss: 0.1842 113/500 [=====>........................] - ETA: 2:07 - loss: 1.2704 - regression_loss: 1.0841 - classification_loss: 0.1863 114/500 [=====>........................] - ETA: 2:06 - loss: 1.2729 - regression_loss: 1.0863 - classification_loss: 0.1865 115/500 [=====>........................] - ETA: 2:06 - loss: 1.2749 - regression_loss: 1.0881 - classification_loss: 0.1868 116/500 [=====>........................] - ETA: 2:06 - loss: 1.2788 - regression_loss: 1.0914 - classification_loss: 0.1874 117/500 [======>.......................] - ETA: 2:05 - loss: 1.2754 - regression_loss: 1.0885 - classification_loss: 0.1869 118/500 [======>.......................] - ETA: 2:05 - loss: 1.2798 - regression_loss: 1.0923 - classification_loss: 0.1876 119/500 [======>.......................] - ETA: 2:05 - loss: 1.2738 - regression_loss: 1.0871 - classification_loss: 0.1867 120/500 [======>.......................] - ETA: 2:05 - loss: 1.2731 - regression_loss: 1.0866 - classification_loss: 0.1864 121/500 [======>.......................] - ETA: 2:04 - loss: 1.2738 - regression_loss: 1.0871 - classification_loss: 0.1866 122/500 [======>.......................] - ETA: 2:04 - loss: 1.2715 - regression_loss: 1.0853 - classification_loss: 0.1861 123/500 [======>.......................] - ETA: 2:04 - loss: 1.2747 - regression_loss: 1.0880 - classification_loss: 0.1866 124/500 [======>.......................] - ETA: 2:04 - loss: 1.2769 - regression_loss: 1.0902 - classification_loss: 0.1867 125/500 [======>.......................] - ETA: 2:03 - loss: 1.2743 - regression_loss: 1.0883 - classification_loss: 0.1860 126/500 [======>.......................] 
- ETA: 2:03 - loss: 1.2663 - regression_loss: 1.0813 - classification_loss: 0.1850 127/500 [======>.......................] - ETA: 2:03 - loss: 1.2625 - regression_loss: 1.0780 - classification_loss: 0.1845 128/500 [======>.......................] - ETA: 2:02 - loss: 1.2651 - regression_loss: 1.0805 - classification_loss: 0.1846 129/500 [======>.......................] - ETA: 2:02 - loss: 1.2679 - regression_loss: 1.0828 - classification_loss: 0.1851 130/500 [======>.......................] - ETA: 2:02 - loss: 1.2699 - regression_loss: 1.0843 - classification_loss: 0.1855 131/500 [======>.......................] - ETA: 2:01 - loss: 1.2653 - regression_loss: 1.0805 - classification_loss: 0.1848 132/500 [======>.......................] - ETA: 2:01 - loss: 1.2682 - regression_loss: 1.0827 - classification_loss: 0.1855 133/500 [======>.......................] - ETA: 2:01 - loss: 1.2673 - regression_loss: 1.0816 - classification_loss: 0.1857 134/500 [=======>......................] - ETA: 2:00 - loss: 1.2696 - regression_loss: 1.0842 - classification_loss: 0.1853 135/500 [=======>......................] - ETA: 2:00 - loss: 1.2678 - regression_loss: 1.0830 - classification_loss: 0.1848 136/500 [=======>......................] - ETA: 2:00 - loss: 1.2669 - regression_loss: 1.0823 - classification_loss: 0.1846 137/500 [=======>......................] - ETA: 1:59 - loss: 1.2637 - regression_loss: 1.0797 - classification_loss: 0.1840 138/500 [=======>......................] - ETA: 1:59 - loss: 1.2648 - regression_loss: 1.0807 - classification_loss: 0.1842 139/500 [=======>......................] - ETA: 1:59 - loss: 1.2597 - regression_loss: 1.0763 - classification_loss: 0.1834 140/500 [=======>......................] - ETA: 1:59 - loss: 1.2594 - regression_loss: 1.0753 - classification_loss: 0.1841 141/500 [=======>......................] - ETA: 1:58 - loss: 1.2594 - regression_loss: 1.0753 - classification_loss: 0.1841 142/500 [=======>......................] 
- ETA: 1:58 - loss: 1.2614 - regression_loss: 1.0771 - classification_loss: 0.1844 143/500 [=======>......................] - ETA: 1:58 - loss: 1.2614 - regression_loss: 1.0770 - classification_loss: 0.1844 144/500 [=======>......................] - ETA: 1:57 - loss: 1.2578 - regression_loss: 1.0733 - classification_loss: 0.1845 145/500 [=======>......................] - ETA: 1:57 - loss: 1.2579 - regression_loss: 1.0737 - classification_loss: 0.1843 146/500 [=======>......................] - ETA: 1:56 - loss: 1.2557 - regression_loss: 1.0702 - classification_loss: 0.1855 147/500 [=======>......................] - ETA: 1:56 - loss: 1.2551 - regression_loss: 1.0698 - classification_loss: 0.1853 148/500 [=======>......................] - ETA: 1:56 - loss: 1.2599 - regression_loss: 1.0741 - classification_loss: 0.1859 149/500 [=======>......................] - ETA: 1:55 - loss: 1.2608 - regression_loss: 1.0750 - classification_loss: 0.1858 150/500 [========>.....................] - ETA: 1:55 - loss: 1.2576 - regression_loss: 1.0723 - classification_loss: 0.1854 151/500 [========>.....................] - ETA: 1:55 - loss: 1.2547 - regression_loss: 1.0695 - classification_loss: 0.1851 152/500 [========>.....................] - ETA: 1:54 - loss: 1.2537 - regression_loss: 1.0688 - classification_loss: 0.1850 153/500 [========>.....................] - ETA: 1:54 - loss: 1.2566 - regression_loss: 1.0713 - classification_loss: 0.1852 154/500 [========>.....................] - ETA: 1:54 - loss: 1.2602 - regression_loss: 1.0742 - classification_loss: 0.1860 155/500 [========>.....................] - ETA: 1:53 - loss: 1.2598 - regression_loss: 1.0738 - classification_loss: 0.1860 156/500 [========>.....................] - ETA: 1:53 - loss: 1.2559 - regression_loss: 1.0703 - classification_loss: 0.1856 157/500 [========>.....................] - ETA: 1:53 - loss: 1.2597 - regression_loss: 1.0731 - classification_loss: 0.1866 158/500 [========>.....................] 
- ETA: 1:52 - loss: 1.2646 - regression_loss: 1.0764 - classification_loss: 0.1882 159/500 [========>.....................] - ETA: 1:52 - loss: 1.2670 - regression_loss: 1.0783 - classification_loss: 0.1887 160/500 [========>.....................] - ETA: 1:52 - loss: 1.2661 - regression_loss: 1.0773 - classification_loss: 0.1888 161/500 [========>.....................] - ETA: 1:51 - loss: 1.2693 - regression_loss: 1.0799 - classification_loss: 0.1893 162/500 [========>.....................] - ETA: 1:51 - loss: 1.2707 - regression_loss: 1.0814 - classification_loss: 0.1893 163/500 [========>.....................] - ETA: 1:51 - loss: 1.2701 - regression_loss: 1.0808 - classification_loss: 0.1893 164/500 [========>.....................] - ETA: 1:50 - loss: 1.2654 - regression_loss: 1.0767 - classification_loss: 0.1887 165/500 [========>.....................] - ETA: 1:50 - loss: 1.2613 - regression_loss: 1.0733 - classification_loss: 0.1880 166/500 [========>.....................] - ETA: 1:50 - loss: 1.2645 - regression_loss: 1.0758 - classification_loss: 0.1887 167/500 [=========>....................] - ETA: 1:50 - loss: 1.2681 - regression_loss: 1.0788 - classification_loss: 0.1893 168/500 [=========>....................] - ETA: 1:49 - loss: 1.2653 - regression_loss: 1.0766 - classification_loss: 0.1887 169/500 [=========>....................] - ETA: 1:49 - loss: 1.2613 - regression_loss: 1.0732 - classification_loss: 0.1881 170/500 [=========>....................] - ETA: 1:49 - loss: 1.2656 - regression_loss: 1.0766 - classification_loss: 0.1890 171/500 [=========>....................] - ETA: 1:48 - loss: 1.2704 - regression_loss: 1.0807 - classification_loss: 0.1897 172/500 [=========>....................] - ETA: 1:48 - loss: 1.2675 - regression_loss: 1.0785 - classification_loss: 0.1890 173/500 [=========>....................] - ETA: 1:47 - loss: 1.2664 - regression_loss: 1.0775 - classification_loss: 0.1889 174/500 [=========>....................] 
- ETA: 1:47 - loss: 1.2701 - regression_loss: 1.0808 - classification_loss: 0.1893 175/500 [=========>....................] - ETA: 1:47 - loss: 1.2699 - regression_loss: 1.0806 - classification_loss: 0.1893 176/500 [=========>....................] - ETA: 1:47 - loss: 1.2710 - regression_loss: 1.0817 - classification_loss: 0.1893 177/500 [=========>....................] - ETA: 1:46 - loss: 1.2722 - regression_loss: 1.0828 - classification_loss: 0.1894 178/500 [=========>....................] - ETA: 1:46 - loss: 1.2742 - regression_loss: 1.0845 - classification_loss: 0.1897 179/500 [=========>....................] - ETA: 1:46 - loss: 1.2754 - regression_loss: 1.0855 - classification_loss: 0.1899 180/500 [=========>....................] - ETA: 1:45 - loss: 1.2754 - regression_loss: 1.0856 - classification_loss: 0.1898 181/500 [=========>....................] - ETA: 1:45 - loss: 1.2729 - regression_loss: 1.0834 - classification_loss: 0.1894 182/500 [=========>....................] - ETA: 1:45 - loss: 1.2693 - regression_loss: 1.0804 - classification_loss: 0.1889 183/500 [=========>....................] - ETA: 1:44 - loss: 1.2670 - regression_loss: 1.0787 - classification_loss: 0.1883 184/500 [==========>...................] - ETA: 1:44 - loss: 1.2685 - regression_loss: 1.0801 - classification_loss: 0.1885 185/500 [==========>...................] - ETA: 1:44 - loss: 1.2670 - regression_loss: 1.0789 - classification_loss: 0.1881 186/500 [==========>...................] - ETA: 1:43 - loss: 1.2676 - regression_loss: 1.0796 - classification_loss: 0.1880 187/500 [==========>...................] - ETA: 1:43 - loss: 1.2643 - regression_loss: 1.0768 - classification_loss: 0.1875 188/500 [==========>...................] - ETA: 1:43 - loss: 1.2659 - regression_loss: 1.0783 - classification_loss: 0.1876 189/500 [==========>...................] - ETA: 1:42 - loss: 1.2667 - regression_loss: 1.0792 - classification_loss: 0.1875 190/500 [==========>...................] 
- ETA: 1:42 - loss: 1.2702 - regression_loss: 1.0823 - classification_loss: 0.1878 191/500 [==========>...................] - ETA: 1:42 - loss: 1.2709 - regression_loss: 1.0826 - classification_loss: 0.1883 192/500 [==========>...................] - ETA: 1:41 - loss: 1.2691 - regression_loss: 1.0812 - classification_loss: 0.1879 193/500 [==========>...................] - ETA: 1:41 - loss: 1.2722 - regression_loss: 1.0837 - classification_loss: 0.1885 194/500 [==========>...................] - ETA: 1:41 - loss: 1.2761 - regression_loss: 1.0870 - classification_loss: 0.1891 195/500 [==========>...................] - ETA: 1:40 - loss: 1.2762 - regression_loss: 1.0871 - classification_loss: 0.1890 196/500 [==========>...................] - ETA: 1:40 - loss: 1.2784 - regression_loss: 1.0893 - classification_loss: 0.1891 197/500 [==========>...................] - ETA: 1:40 - loss: 1.2773 - regression_loss: 1.0883 - classification_loss: 0.1890 198/500 [==========>...................] - ETA: 1:39 - loss: 1.2793 - regression_loss: 1.0902 - classification_loss: 0.1892 199/500 [==========>...................] - ETA: 1:39 - loss: 1.2819 - regression_loss: 1.0925 - classification_loss: 0.1894 200/500 [===========>..................] - ETA: 1:38 - loss: 1.2787 - regression_loss: 1.0895 - classification_loss: 0.1892 201/500 [===========>..................] - ETA: 1:38 - loss: 1.2790 - regression_loss: 1.0897 - classification_loss: 0.1893 202/500 [===========>..................] - ETA: 1:38 - loss: 1.2798 - regression_loss: 1.0904 - classification_loss: 0.1894 203/500 [===========>..................] - ETA: 1:37 - loss: 1.2768 - regression_loss: 1.0879 - classification_loss: 0.1889 204/500 [===========>..................] - ETA: 1:37 - loss: 1.2772 - regression_loss: 1.0883 - classification_loss: 0.1889 205/500 [===========>..................] - ETA: 1:37 - loss: 1.2819 - regression_loss: 1.0920 - classification_loss: 0.1899 206/500 [===========>..................] 
- ETA: 1:36 - loss: 1.2832 - regression_loss: 1.0930 - classification_loss: 0.1902 207/500 [===========>..................] - ETA: 1:36 - loss: 1.2841 - regression_loss: 1.0942 - classification_loss: 0.1899 208/500 [===========>..................] - ETA: 1:36 - loss: 1.2803 - regression_loss: 1.0909 - classification_loss: 0.1893 209/500 [===========>..................] - ETA: 1:35 - loss: 1.2789 - regression_loss: 1.0899 - classification_loss: 0.1889 210/500 [===========>..................] - ETA: 1:35 - loss: 1.2806 - regression_loss: 1.0915 - classification_loss: 0.1892 211/500 [===========>..................] - ETA: 1:35 - loss: 1.2791 - regression_loss: 1.0901 - classification_loss: 0.1890 212/500 [===========>..................] - ETA: 1:34 - loss: 1.2812 - regression_loss: 1.0918 - classification_loss: 0.1894 213/500 [===========>..................] - ETA: 1:34 - loss: 1.2799 - regression_loss: 1.0909 - classification_loss: 0.1891 214/500 [===========>..................] - ETA: 1:34 - loss: 1.2776 - regression_loss: 1.0890 - classification_loss: 0.1885 215/500 [===========>..................] - ETA: 1:34 - loss: 1.2736 - regression_loss: 1.0856 - classification_loss: 0.1880 216/500 [===========>..................] - ETA: 1:33 - loss: 1.2739 - regression_loss: 1.0859 - classification_loss: 0.1881 217/500 [============>.................] - ETA: 1:33 - loss: 1.2715 - regression_loss: 1.0836 - classification_loss: 0.1879 218/500 [============>.................] - ETA: 1:33 - loss: 1.2711 - regression_loss: 1.0832 - classification_loss: 0.1879 219/500 [============>.................] - ETA: 1:32 - loss: 1.2694 - regression_loss: 1.0818 - classification_loss: 0.1876 220/500 [============>.................] - ETA: 1:32 - loss: 1.2686 - regression_loss: 1.0811 - classification_loss: 0.1875 221/500 [============>.................] - ETA: 1:32 - loss: 1.2712 - regression_loss: 1.0833 - classification_loss: 0.1879 222/500 [============>.................] 
- ETA: 1:31 - loss: 1.2719 - regression_loss: 1.0840 - classification_loss: 0.1880 223/500 [============>.................] - ETA: 1:31 - loss: 1.2723 - regression_loss: 1.0843 - classification_loss: 0.1881 224/500 [============>.................] - ETA: 1:30 - loss: 1.2742 - regression_loss: 1.0859 - classification_loss: 0.1883 225/500 [============>.................] - ETA: 1:30 - loss: 1.2739 - regression_loss: 1.0857 - classification_loss: 0.1882 226/500 [============>.................] - ETA: 1:30 - loss: 1.2732 - regression_loss: 1.0850 - classification_loss: 0.1882 227/500 [============>.................] - ETA: 1:29 - loss: 1.2713 - regression_loss: 1.0836 - classification_loss: 0.1877 228/500 [============>.................] - ETA: 1:29 - loss: 1.2727 - regression_loss: 1.0850 - classification_loss: 0.1877 229/500 [============>.................] - ETA: 1:29 - loss: 1.2721 - regression_loss: 1.0844 - classification_loss: 0.1876 230/500 [============>.................] - ETA: 1:29 - loss: 1.2727 - regression_loss: 1.0851 - classification_loss: 0.1876 231/500 [============>.................] - ETA: 1:28 - loss: 1.2716 - regression_loss: 1.0843 - classification_loss: 0.1874 232/500 [============>.................] - ETA: 1:28 - loss: 1.2734 - regression_loss: 1.0857 - classification_loss: 0.1877 233/500 [============>.................] - ETA: 1:28 - loss: 1.2736 - regression_loss: 1.0858 - classification_loss: 0.1878 234/500 [=============>................] - ETA: 1:27 - loss: 1.2748 - regression_loss: 1.0868 - classification_loss: 0.1880 235/500 [=============>................] - ETA: 1:27 - loss: 1.2760 - regression_loss: 1.0879 - classification_loss: 0.1881 236/500 [=============>................] - ETA: 1:27 - loss: 1.2768 - regression_loss: 1.0884 - classification_loss: 0.1884 237/500 [=============>................] - ETA: 1:26 - loss: 1.2756 - regression_loss: 1.0875 - classification_loss: 0.1881 238/500 [=============>................] 
- ETA: 1:26 - loss: 1.2757 - regression_loss: 1.0876 - classification_loss: 0.1882 239/500 [=============>................] - ETA: 1:26 - loss: 1.2751 - regression_loss: 1.0872 - classification_loss: 0.1879 240/500 [=============>................] - ETA: 1:25 - loss: 1.2769 - regression_loss: 1.0887 - classification_loss: 0.1883 241/500 [=============>................] - ETA: 1:25 - loss: 1.2772 - regression_loss: 1.0888 - classification_loss: 0.1884 242/500 [=============>................] - ETA: 1:25 - loss: 1.2764 - regression_loss: 1.0883 - classification_loss: 0.1881 243/500 [=============>................] - ETA: 1:24 - loss: 1.2771 - regression_loss: 1.0889 - classification_loss: 0.1882 244/500 [=============>................] - ETA: 1:24 - loss: 1.2782 - regression_loss: 1.0898 - classification_loss: 0.1884 245/500 [=============>................] - ETA: 1:24 - loss: 1.2795 - regression_loss: 1.0910 - classification_loss: 0.1885 246/500 [=============>................] - ETA: 1:23 - loss: 1.2820 - regression_loss: 1.0930 - classification_loss: 0.1889 247/500 [=============>................] - ETA: 1:23 - loss: 1.2835 - regression_loss: 1.0944 - classification_loss: 0.1891 248/500 [=============>................] - ETA: 1:23 - loss: 1.2850 - regression_loss: 1.0956 - classification_loss: 0.1894 249/500 [=============>................] - ETA: 1:22 - loss: 1.2823 - regression_loss: 1.0931 - classification_loss: 0.1891 250/500 [==============>...............] - ETA: 1:22 - loss: 1.2802 - regression_loss: 1.0914 - classification_loss: 0.1888 251/500 [==============>...............] - ETA: 1:22 - loss: 1.2796 - regression_loss: 1.0908 - classification_loss: 0.1888 252/500 [==============>...............] - ETA: 1:21 - loss: 1.2785 - regression_loss: 1.0899 - classification_loss: 0.1886 253/500 [==============>...............] - ETA: 1:21 - loss: 1.2763 - regression_loss: 1.0881 - classification_loss: 0.1882 254/500 [==============>...............] 
- ETA: 1:21 - loss: 1.2765 - regression_loss: 1.0882 - classification_loss: 0.1882 255/500 [==============>...............] - ETA: 1:20 - loss: 1.2789 - regression_loss: 1.0902 - classification_loss: 0.1887 256/500 [==============>...............] - ETA: 1:20 - loss: 1.2800 - regression_loss: 1.0912 - classification_loss: 0.1888 257/500 [==============>...............] - ETA: 1:20 - loss: 1.2792 - regression_loss: 1.0907 - classification_loss: 0.1886 258/500 [==============>...............] - ETA: 1:19 - loss: 1.2783 - regression_loss: 1.0899 - classification_loss: 0.1884 259/500 [==============>...............] - ETA: 1:19 - loss: 1.2778 - regression_loss: 1.0897 - classification_loss: 0.1881 260/500 [==============>...............] - ETA: 1:19 - loss: 1.2776 - regression_loss: 1.0893 - classification_loss: 0.1883 261/500 [==============>...............] - ETA: 1:18 - loss: 1.2761 - regression_loss: 1.0880 - classification_loss: 0.1881 262/500 [==============>...............] - ETA: 1:18 - loss: 1.2765 - regression_loss: 1.0883 - classification_loss: 0.1881 263/500 [==============>...............] - ETA: 1:18 - loss: 1.2781 - regression_loss: 1.0897 - classification_loss: 0.1883 264/500 [==============>...............] - ETA: 1:17 - loss: 1.2788 - regression_loss: 1.0905 - classification_loss: 0.1883 265/500 [==============>...............] - ETA: 1:17 - loss: 1.2792 - regression_loss: 1.0910 - classification_loss: 0.1881 266/500 [==============>...............] - ETA: 1:17 - loss: 1.2811 - regression_loss: 1.0928 - classification_loss: 0.1883 267/500 [===============>..............] - ETA: 1:16 - loss: 1.2796 - regression_loss: 1.0914 - classification_loss: 0.1881 268/500 [===============>..............] - ETA: 1:16 - loss: 1.2801 - regression_loss: 1.0920 - classification_loss: 0.1881 269/500 [===============>..............] - ETA: 1:16 - loss: 1.2801 - regression_loss: 1.0921 - classification_loss: 0.1880 270/500 [===============>..............] 
- ETA: 1:15 - loss: 1.2786 - regression_loss: 1.0909 - classification_loss: 0.1877 271/500 [===============>..............] - ETA: 1:15 - loss: 1.2789 - regression_loss: 1.0913 - classification_loss: 0.1876 272/500 [===============>..............] - ETA: 1:15 - loss: 1.2772 - regression_loss: 1.0899 - classification_loss: 0.1873 273/500 [===============>..............] - ETA: 1:14 - loss: 1.2748 - regression_loss: 1.0878 - classification_loss: 0.1869 274/500 [===============>..............] - ETA: 1:14 - loss: 1.2729 - regression_loss: 1.0863 - classification_loss: 0.1866 275/500 [===============>..............] - ETA: 1:14 - loss: 1.2733 - regression_loss: 1.0866 - classification_loss: 0.1867 276/500 [===============>..............] - ETA: 1:13 - loss: 1.2710 - regression_loss: 1.0846 - classification_loss: 0.1864 277/500 [===============>..............] - ETA: 1:13 - loss: 1.2686 - regression_loss: 1.0826 - classification_loss: 0.1860 278/500 [===============>..............] - ETA: 1:13 - loss: 1.2715 - regression_loss: 1.0848 - classification_loss: 0.1867 279/500 [===============>..............] - ETA: 1:12 - loss: 1.2714 - regression_loss: 1.0846 - classification_loss: 0.1867 280/500 [===============>..............] - ETA: 1:12 - loss: 1.2724 - regression_loss: 1.0856 - classification_loss: 0.1868 281/500 [===============>..............] - ETA: 1:12 - loss: 1.2703 - regression_loss: 1.0839 - classification_loss: 0.1864 282/500 [===============>..............] - ETA: 1:11 - loss: 1.2702 - regression_loss: 1.0838 - classification_loss: 0.1864 283/500 [===============>..............] - ETA: 1:11 - loss: 1.2714 - regression_loss: 1.0848 - classification_loss: 0.1866 284/500 [================>.............] - ETA: 1:11 - loss: 1.2734 - regression_loss: 1.0865 - classification_loss: 0.1869 285/500 [================>.............] - ETA: 1:10 - loss: 1.2718 - regression_loss: 1.0852 - classification_loss: 0.1867 286/500 [================>.............] 
- ETA: 1:10 - loss: 1.2699 - regression_loss: 1.0836 - classification_loss: 0.1862
[per-batch progress updates for batches 288-493 of epoch 22 omitted; total loss held steady around 1.25-1.27 (regression_loss ~1.07, classification_loss ~0.18)]
494/500 [============================>.]
- ETA: 1s - loss: 1.2492 - regression_loss: 1.0671 - classification_loss: 0.1820
[per-batch progress updates for batches 495-499 omitted]
500/500 [==============================] - 165s 330ms/step - loss: 1.2494 - regression_loss: 1.0674 - classification_loss: 0.1820
1172 instances of class plum with average precision: 0.5740
mAP: 0.5740
Epoch 00022: saving model to ./training/snapshots/resnet101_pascal_22.h5
Epoch 23/150
1/500 [..............................] - ETA: 2:30 - loss: 1.7050 - regression_loss: 1.4640 - classification_loss: 0.2410
[per-batch progress updates for batches 2-8 of epoch 23 omitted; loss fell from 1.71 to ~1.47]
9/500 [..............................]
- ETA: 2:39 - loss: 1.4535 - regression_loss: 1.2520 - classification_loss: 0.2015
[per-batch progress updates for batches 10-120 of epoch 23 omitted; total loss declined from ~1.45 to ~1.27 (regression_loss ~1.09, classification_loss ~0.18)]
121/500 [======>.......................]
- ETA: 2:05 - loss: 1.2732 - regression_loss: 1.0928 - classification_loss: 0.1804 122/500 [======>.......................] - ETA: 2:05 - loss: 1.2743 - regression_loss: 1.0939 - classification_loss: 0.1803 123/500 [======>.......................] - ETA: 2:05 - loss: 1.2790 - regression_loss: 1.0979 - classification_loss: 0.1811 124/500 [======>.......................] - ETA: 2:04 - loss: 1.2799 - regression_loss: 1.0987 - classification_loss: 0.1812 125/500 [======>.......................] - ETA: 2:04 - loss: 1.2785 - regression_loss: 1.0978 - classification_loss: 0.1807 126/500 [======>.......................] - ETA: 2:04 - loss: 1.2760 - regression_loss: 1.0959 - classification_loss: 0.1801 127/500 [======>.......................] - ETA: 2:03 - loss: 1.2755 - regression_loss: 1.0960 - classification_loss: 0.1795 128/500 [======>.......................] - ETA: 2:03 - loss: 1.2793 - regression_loss: 1.0992 - classification_loss: 0.1801 129/500 [======>.......................] - ETA: 2:03 - loss: 1.2761 - regression_loss: 1.0966 - classification_loss: 0.1794 130/500 [======>.......................] - ETA: 2:02 - loss: 1.2777 - regression_loss: 1.0978 - classification_loss: 0.1798 131/500 [======>.......................] - ETA: 2:02 - loss: 1.2748 - regression_loss: 1.0954 - classification_loss: 0.1794 132/500 [======>.......................] - ETA: 2:02 - loss: 1.2762 - regression_loss: 1.0963 - classification_loss: 0.1798 133/500 [======>.......................] - ETA: 2:01 - loss: 1.2810 - regression_loss: 1.1004 - classification_loss: 0.1806 134/500 [=======>......................] - ETA: 2:01 - loss: 1.2752 - regression_loss: 1.0953 - classification_loss: 0.1798 135/500 [=======>......................] - ETA: 2:01 - loss: 1.2703 - regression_loss: 1.0911 - classification_loss: 0.1792 136/500 [=======>......................] - ETA: 2:00 - loss: 1.2679 - regression_loss: 1.0891 - classification_loss: 0.1788 137/500 [=======>......................] 
- ETA: 2:00 - loss: 1.2693 - regression_loss: 1.0904 - classification_loss: 0.1788 138/500 [=======>......................] - ETA: 2:00 - loss: 1.2701 - regression_loss: 1.0910 - classification_loss: 0.1791 139/500 [=======>......................] - ETA: 1:59 - loss: 1.2731 - regression_loss: 1.0936 - classification_loss: 0.1795 140/500 [=======>......................] - ETA: 1:59 - loss: 1.2697 - regression_loss: 1.0907 - classification_loss: 0.1790 141/500 [=======>......................] - ETA: 1:59 - loss: 1.2745 - regression_loss: 1.0947 - classification_loss: 0.1798 142/500 [=======>......................] - ETA: 1:58 - loss: 1.2783 - regression_loss: 1.0977 - classification_loss: 0.1805 143/500 [=======>......................] - ETA: 1:58 - loss: 1.2773 - regression_loss: 1.0969 - classification_loss: 0.1804 144/500 [=======>......................] - ETA: 1:58 - loss: 1.2710 - regression_loss: 1.0913 - classification_loss: 0.1797 145/500 [=======>......................] - ETA: 1:57 - loss: 1.2724 - regression_loss: 1.0926 - classification_loss: 0.1798 146/500 [=======>......................] - ETA: 1:57 - loss: 1.2731 - regression_loss: 1.0932 - classification_loss: 0.1799 147/500 [=======>......................] - ETA: 1:57 - loss: 1.2736 - regression_loss: 1.0938 - classification_loss: 0.1798 148/500 [=======>......................] - ETA: 1:56 - loss: 1.2740 - regression_loss: 1.0944 - classification_loss: 0.1796 149/500 [=======>......................] - ETA: 1:56 - loss: 1.2766 - regression_loss: 1.0967 - classification_loss: 0.1800 150/500 [========>.....................] - ETA: 1:56 - loss: 1.2786 - regression_loss: 1.0985 - classification_loss: 0.1802 151/500 [========>.....................] - ETA: 1:55 - loss: 1.2803 - regression_loss: 1.0999 - classification_loss: 0.1803 152/500 [========>.....................] - ETA: 1:55 - loss: 1.2791 - regression_loss: 1.0988 - classification_loss: 0.1803 153/500 [========>.....................] 
- ETA: 1:55 - loss: 1.2769 - regression_loss: 1.0973 - classification_loss: 0.1796 154/500 [========>.....................] - ETA: 1:55 - loss: 1.2715 - regression_loss: 1.0925 - classification_loss: 0.1789 155/500 [========>.....................] - ETA: 1:54 - loss: 1.2685 - regression_loss: 1.0901 - classification_loss: 0.1784 156/500 [========>.....................] - ETA: 1:54 - loss: 1.2635 - regression_loss: 1.0858 - classification_loss: 0.1777 157/500 [========>.....................] - ETA: 1:54 - loss: 1.2645 - regression_loss: 1.0868 - classification_loss: 0.1777 158/500 [========>.....................] - ETA: 1:53 - loss: 1.2665 - regression_loss: 1.0883 - classification_loss: 0.1782 159/500 [========>.....................] - ETA: 1:53 - loss: 1.2701 - regression_loss: 1.0914 - classification_loss: 0.1787 160/500 [========>.....................] - ETA: 1:53 - loss: 1.2655 - regression_loss: 1.0874 - classification_loss: 0.1781 161/500 [========>.....................] - ETA: 1:52 - loss: 1.2697 - regression_loss: 1.0909 - classification_loss: 0.1788 162/500 [========>.....................] - ETA: 1:52 - loss: 1.2667 - regression_loss: 1.0885 - classification_loss: 0.1782 163/500 [========>.....................] - ETA: 1:51 - loss: 1.2694 - regression_loss: 1.0906 - classification_loss: 0.1788 164/500 [========>.....................] - ETA: 1:51 - loss: 1.2643 - regression_loss: 1.0861 - classification_loss: 0.1782 165/500 [========>.....................] - ETA: 1:51 - loss: 1.2640 - regression_loss: 1.0857 - classification_loss: 0.1783 166/500 [========>.....................] - ETA: 1:50 - loss: 1.2655 - regression_loss: 1.0872 - classification_loss: 0.1783 167/500 [=========>....................] - ETA: 1:50 - loss: 1.2640 - regression_loss: 1.0862 - classification_loss: 0.1778 168/500 [=========>....................] - ETA: 1:50 - loss: 1.2651 - regression_loss: 1.0872 - classification_loss: 0.1779 169/500 [=========>....................] 
- ETA: 1:49 - loss: 1.2609 - regression_loss: 1.0835 - classification_loss: 0.1774 170/500 [=========>....................] - ETA: 1:49 - loss: 1.2616 - regression_loss: 1.0841 - classification_loss: 0.1775 171/500 [=========>....................] - ETA: 1:49 - loss: 1.2594 - regression_loss: 1.0820 - classification_loss: 0.1774 172/500 [=========>....................] - ETA: 1:48 - loss: 1.2563 - regression_loss: 1.0792 - classification_loss: 0.1771 173/500 [=========>....................] - ETA: 1:48 - loss: 1.2537 - regression_loss: 1.0768 - classification_loss: 0.1768 174/500 [=========>....................] - ETA: 1:48 - loss: 1.2535 - regression_loss: 1.0768 - classification_loss: 0.1768 175/500 [=========>....................] - ETA: 1:47 - loss: 1.2544 - regression_loss: 1.0777 - classification_loss: 0.1767 176/500 [=========>....................] - ETA: 1:47 - loss: 1.2507 - regression_loss: 1.0744 - classification_loss: 0.1762 177/500 [=========>....................] - ETA: 1:47 - loss: 1.2508 - regression_loss: 1.0748 - classification_loss: 0.1760 178/500 [=========>....................] - ETA: 1:46 - loss: 1.2532 - regression_loss: 1.0764 - classification_loss: 0.1768 179/500 [=========>....................] - ETA: 1:46 - loss: 1.2506 - regression_loss: 1.0744 - classification_loss: 0.1762 180/500 [=========>....................] - ETA: 1:46 - loss: 1.2507 - regression_loss: 1.0746 - classification_loss: 0.1762 181/500 [=========>....................] - ETA: 1:45 - loss: 1.2516 - regression_loss: 1.0754 - classification_loss: 0.1762 182/500 [=========>....................] - ETA: 1:45 - loss: 1.2502 - regression_loss: 1.0741 - classification_loss: 0.1760 183/500 [=========>....................] - ETA: 1:45 - loss: 1.2499 - regression_loss: 1.0739 - classification_loss: 0.1759 184/500 [==========>...................] - ETA: 1:44 - loss: 1.2515 - regression_loss: 1.0754 - classification_loss: 0.1762 185/500 [==========>...................] 
- ETA: 1:44 - loss: 1.2484 - regression_loss: 1.0726 - classification_loss: 0.1758 186/500 [==========>...................] - ETA: 1:44 - loss: 1.2482 - regression_loss: 1.0725 - classification_loss: 0.1757 187/500 [==========>...................] - ETA: 1:43 - loss: 1.2505 - regression_loss: 1.0745 - classification_loss: 0.1760 188/500 [==========>...................] - ETA: 1:43 - loss: 1.2531 - regression_loss: 1.0767 - classification_loss: 0.1764 189/500 [==========>...................] - ETA: 1:43 - loss: 1.2533 - regression_loss: 1.0770 - classification_loss: 0.1763 190/500 [==========>...................] - ETA: 1:42 - loss: 1.2570 - regression_loss: 1.0801 - classification_loss: 0.1769 191/500 [==========>...................] - ETA: 1:42 - loss: 1.2557 - regression_loss: 1.0790 - classification_loss: 0.1767 192/500 [==========>...................] - ETA: 1:42 - loss: 1.2588 - regression_loss: 1.0815 - classification_loss: 0.1773 193/500 [==========>...................] - ETA: 1:41 - loss: 1.2562 - regression_loss: 1.0793 - classification_loss: 0.1769 194/500 [==========>...................] - ETA: 1:41 - loss: 1.2576 - regression_loss: 1.0804 - classification_loss: 0.1772 195/500 [==========>...................] - ETA: 1:41 - loss: 1.2605 - regression_loss: 1.0831 - classification_loss: 0.1774 196/500 [==========>...................] - ETA: 1:40 - loss: 1.2622 - regression_loss: 1.0845 - classification_loss: 0.1778 197/500 [==========>...................] - ETA: 1:40 - loss: 1.2632 - regression_loss: 1.0855 - classification_loss: 0.1777 198/500 [==========>...................] - ETA: 1:40 - loss: 1.2613 - regression_loss: 1.0838 - classification_loss: 0.1775 199/500 [==========>...................] - ETA: 1:39 - loss: 1.2604 - regression_loss: 1.0831 - classification_loss: 0.1773 200/500 [===========>..................] - ETA: 1:39 - loss: 1.2603 - regression_loss: 1.0828 - classification_loss: 0.1775 201/500 [===========>..................] 
- ETA: 1:39 - loss: 1.2648 - regression_loss: 1.0860 - classification_loss: 0.1788 202/500 [===========>..................] - ETA: 1:38 - loss: 1.2634 - regression_loss: 1.0847 - classification_loss: 0.1788 203/500 [===========>..................] - ETA: 1:38 - loss: 1.2641 - regression_loss: 1.0853 - classification_loss: 0.1788 204/500 [===========>..................] - ETA: 1:38 - loss: 1.2640 - regression_loss: 1.0851 - classification_loss: 0.1789 205/500 [===========>..................] - ETA: 1:37 - loss: 1.2645 - regression_loss: 1.0855 - classification_loss: 0.1790 206/500 [===========>..................] - ETA: 1:37 - loss: 1.2608 - regression_loss: 1.0822 - classification_loss: 0.1786 207/500 [===========>..................] - ETA: 1:37 - loss: 1.2626 - regression_loss: 1.0833 - classification_loss: 0.1793 208/500 [===========>..................] - ETA: 1:36 - loss: 1.2603 - regression_loss: 1.0812 - classification_loss: 0.1791 209/500 [===========>..................] - ETA: 1:36 - loss: 1.2573 - regression_loss: 1.0786 - classification_loss: 0.1786 210/500 [===========>..................] - ETA: 1:36 - loss: 1.2571 - regression_loss: 1.0785 - classification_loss: 0.1786 211/500 [===========>..................] - ETA: 1:35 - loss: 1.2578 - regression_loss: 1.0790 - classification_loss: 0.1787 212/500 [===========>..................] - ETA: 1:35 - loss: 1.2599 - regression_loss: 1.0807 - classification_loss: 0.1792 213/500 [===========>..................] - ETA: 1:35 - loss: 1.2613 - regression_loss: 1.0818 - classification_loss: 0.1795 214/500 [===========>..................] - ETA: 1:34 - loss: 1.2577 - regression_loss: 1.0787 - classification_loss: 0.1789 215/500 [===========>..................] - ETA: 1:34 - loss: 1.2565 - regression_loss: 1.0775 - classification_loss: 0.1789 216/500 [===========>..................] - ETA: 1:34 - loss: 1.2578 - regression_loss: 1.0787 - classification_loss: 0.1791 217/500 [============>.................] 
- ETA: 1:33 - loss: 1.2569 - regression_loss: 1.0779 - classification_loss: 0.1790 218/500 [============>.................] - ETA: 1:33 - loss: 1.2548 - regression_loss: 1.0762 - classification_loss: 0.1786 219/500 [============>.................] - ETA: 1:33 - loss: 1.2567 - regression_loss: 1.0777 - classification_loss: 0.1790 220/500 [============>.................] - ETA: 1:32 - loss: 1.2584 - regression_loss: 1.0790 - classification_loss: 0.1795 221/500 [============>.................] - ETA: 1:32 - loss: 1.2600 - regression_loss: 1.0805 - classification_loss: 0.1795 222/500 [============>.................] - ETA: 1:32 - loss: 1.2640 - regression_loss: 1.0839 - classification_loss: 0.1801 223/500 [============>.................] - ETA: 1:31 - loss: 1.2644 - regression_loss: 1.0842 - classification_loss: 0.1802 224/500 [============>.................] - ETA: 1:31 - loss: 1.2610 - regression_loss: 1.0812 - classification_loss: 0.1798 225/500 [============>.................] - ETA: 1:31 - loss: 1.2589 - regression_loss: 1.0796 - classification_loss: 0.1793 226/500 [============>.................] - ETA: 1:30 - loss: 1.2607 - regression_loss: 1.0811 - classification_loss: 0.1796 227/500 [============>.................] - ETA: 1:30 - loss: 1.2584 - regression_loss: 1.0792 - classification_loss: 0.1792 228/500 [============>.................] - ETA: 1:30 - loss: 1.2568 - regression_loss: 1.0778 - classification_loss: 0.1790 229/500 [============>.................] - ETA: 1:29 - loss: 1.2558 - regression_loss: 1.0771 - classification_loss: 0.1787 230/500 [============>.................] - ETA: 1:29 - loss: 1.2577 - regression_loss: 1.0786 - classification_loss: 0.1791 231/500 [============>.................] - ETA: 1:29 - loss: 1.2580 - regression_loss: 1.0788 - classification_loss: 0.1792 232/500 [============>.................] - ETA: 1:28 - loss: 1.2594 - regression_loss: 1.0799 - classification_loss: 0.1795 233/500 [============>.................] 
- ETA: 1:28 - loss: 1.2583 - regression_loss: 1.0790 - classification_loss: 0.1793 234/500 [=============>................] - ETA: 1:28 - loss: 1.2572 - regression_loss: 1.0780 - classification_loss: 0.1792 235/500 [=============>................] - ETA: 1:27 - loss: 1.2555 - regression_loss: 1.0765 - classification_loss: 0.1790 236/500 [=============>................] - ETA: 1:27 - loss: 1.2545 - regression_loss: 1.0756 - classification_loss: 0.1789 237/500 [=============>................] - ETA: 1:27 - loss: 1.2519 - regression_loss: 1.0734 - classification_loss: 0.1785 238/500 [=============>................] - ETA: 1:26 - loss: 1.2521 - regression_loss: 1.0736 - classification_loss: 0.1785 239/500 [=============>................] - ETA: 1:26 - loss: 1.2546 - regression_loss: 1.0756 - classification_loss: 0.1790 240/500 [=============>................] - ETA: 1:26 - loss: 1.2522 - regression_loss: 1.0736 - classification_loss: 0.1786 241/500 [=============>................] - ETA: 1:25 - loss: 1.2540 - regression_loss: 1.0751 - classification_loss: 0.1789 242/500 [=============>................] - ETA: 1:25 - loss: 1.2542 - regression_loss: 1.0753 - classification_loss: 0.1789 243/500 [=============>................] - ETA: 1:25 - loss: 1.2503 - regression_loss: 1.0719 - classification_loss: 0.1785 244/500 [=============>................] - ETA: 1:24 - loss: 1.2494 - regression_loss: 1.0713 - classification_loss: 0.1781 245/500 [=============>................] - ETA: 1:24 - loss: 1.2506 - regression_loss: 1.0728 - classification_loss: 0.1778 246/500 [=============>................] - ETA: 1:24 - loss: 1.2522 - regression_loss: 1.0740 - classification_loss: 0.1781 247/500 [=============>................] - ETA: 1:23 - loss: 1.2530 - regression_loss: 1.0749 - classification_loss: 0.1782 248/500 [=============>................] - ETA: 1:23 - loss: 1.2513 - regression_loss: 1.0733 - classification_loss: 0.1780 249/500 [=============>................] 
- ETA: 1:23 - loss: 1.2521 - regression_loss: 1.0741 - classification_loss: 0.1781 250/500 [==============>...............] - ETA: 1:22 - loss: 1.2535 - regression_loss: 1.0752 - classification_loss: 0.1782 251/500 [==============>...............] - ETA: 1:22 - loss: 1.2524 - regression_loss: 1.0744 - classification_loss: 0.1780 252/500 [==============>...............] - ETA: 1:22 - loss: 1.2499 - regression_loss: 1.0723 - classification_loss: 0.1776 253/500 [==============>...............] - ETA: 1:21 - loss: 1.2510 - regression_loss: 1.0731 - classification_loss: 0.1779 254/500 [==============>...............] - ETA: 1:21 - loss: 1.2514 - regression_loss: 1.0735 - classification_loss: 0.1779 255/500 [==============>...............] - ETA: 1:21 - loss: 1.2486 - regression_loss: 1.0710 - classification_loss: 0.1776 256/500 [==============>...............] - ETA: 1:20 - loss: 1.2471 - regression_loss: 1.0694 - classification_loss: 0.1776 257/500 [==============>...............] - ETA: 1:20 - loss: 1.2445 - regression_loss: 1.0672 - classification_loss: 0.1772 258/500 [==============>...............] - ETA: 1:20 - loss: 1.2410 - regression_loss: 1.0642 - classification_loss: 0.1768 259/500 [==============>...............] - ETA: 1:19 - loss: 1.2384 - regression_loss: 1.0620 - classification_loss: 0.1764 260/500 [==============>...............] - ETA: 1:19 - loss: 1.2351 - regression_loss: 1.0592 - classification_loss: 0.1760 261/500 [==============>...............] - ETA: 1:19 - loss: 1.2343 - regression_loss: 1.0584 - classification_loss: 0.1758 262/500 [==============>...............] - ETA: 1:19 - loss: 1.2338 - regression_loss: 1.0580 - classification_loss: 0.1758 263/500 [==============>...............] - ETA: 1:18 - loss: 1.2351 - regression_loss: 1.0591 - classification_loss: 0.1760 264/500 [==============>...............] - ETA: 1:18 - loss: 1.2354 - regression_loss: 1.0593 - classification_loss: 0.1761 265/500 [==============>...............] 
- ETA: 1:18 - loss: 1.2356 - regression_loss: 1.0593 - classification_loss: 0.1763 266/500 [==============>...............] - ETA: 1:17 - loss: 1.2350 - regression_loss: 1.0589 - classification_loss: 0.1761 267/500 [===============>..............] - ETA: 1:17 - loss: 1.2339 - regression_loss: 1.0580 - classification_loss: 0.1759 268/500 [===============>..............] - ETA: 1:17 - loss: 1.2318 - regression_loss: 1.0563 - classification_loss: 0.1755 269/500 [===============>..............] - ETA: 1:16 - loss: 1.2326 - regression_loss: 1.0571 - classification_loss: 0.1755 270/500 [===============>..............] - ETA: 1:16 - loss: 1.2334 - regression_loss: 1.0578 - classification_loss: 0.1756 271/500 [===============>..............] - ETA: 1:16 - loss: 1.2314 - regression_loss: 1.0558 - classification_loss: 0.1755 272/500 [===============>..............] - ETA: 1:15 - loss: 1.2310 - regression_loss: 1.0555 - classification_loss: 0.1755 273/500 [===============>..............] - ETA: 1:15 - loss: 1.2324 - regression_loss: 1.0565 - classification_loss: 0.1758 274/500 [===============>..............] - ETA: 1:15 - loss: 1.2320 - regression_loss: 1.0564 - classification_loss: 0.1755 275/500 [===============>..............] - ETA: 1:14 - loss: 1.2334 - regression_loss: 1.0576 - classification_loss: 0.1758 276/500 [===============>..............] - ETA: 1:14 - loss: 1.2347 - regression_loss: 1.0587 - classification_loss: 0.1760 277/500 [===============>..............] - ETA: 1:14 - loss: 1.2353 - regression_loss: 1.0596 - classification_loss: 0.1758 278/500 [===============>..............] - ETA: 1:13 - loss: 1.2337 - regression_loss: 1.0581 - classification_loss: 0.1756 279/500 [===============>..............] - ETA: 1:13 - loss: 1.2343 - regression_loss: 1.0586 - classification_loss: 0.1757 280/500 [===============>..............] - ETA: 1:13 - loss: 1.2355 - regression_loss: 1.0597 - classification_loss: 0.1758 281/500 [===============>..............] 
- ETA: 1:12 - loss: 1.2364 - regression_loss: 1.0605 - classification_loss: 0.1759 282/500 [===============>..............] - ETA: 1:12 - loss: 1.2356 - regression_loss: 1.0600 - classification_loss: 0.1757 283/500 [===============>..............] - ETA: 1:11 - loss: 1.2353 - regression_loss: 1.0595 - classification_loss: 0.1758 284/500 [================>.............] - ETA: 1:11 - loss: 1.2369 - regression_loss: 1.0606 - classification_loss: 0.1763 285/500 [================>.............] - ETA: 1:11 - loss: 1.2399 - regression_loss: 1.0634 - classification_loss: 0.1766 286/500 [================>.............] - ETA: 1:10 - loss: 1.2415 - regression_loss: 1.0647 - classification_loss: 0.1768 287/500 [================>.............] - ETA: 1:10 - loss: 1.2394 - regression_loss: 1.0630 - classification_loss: 0.1764 288/500 [================>.............] - ETA: 1:10 - loss: 1.2403 - regression_loss: 1.0638 - classification_loss: 0.1765 289/500 [================>.............] - ETA: 1:09 - loss: 1.2414 - regression_loss: 1.0646 - classification_loss: 0.1768 290/500 [================>.............] - ETA: 1:09 - loss: 1.2424 - regression_loss: 1.0655 - classification_loss: 0.1769 291/500 [================>.............] - ETA: 1:09 - loss: 1.2438 - regression_loss: 1.0667 - classification_loss: 0.1771 292/500 [================>.............] - ETA: 1:08 - loss: 1.2432 - regression_loss: 1.0663 - classification_loss: 0.1768 293/500 [================>.............] - ETA: 1:08 - loss: 1.2449 - regression_loss: 1.0677 - classification_loss: 0.1772 294/500 [================>.............] - ETA: 1:08 - loss: 1.2445 - regression_loss: 1.0673 - classification_loss: 0.1772 295/500 [================>.............] - ETA: 1:07 - loss: 1.2436 - regression_loss: 1.0666 - classification_loss: 0.1770 296/500 [================>.............] - ETA: 1:07 - loss: 1.2443 - regression_loss: 1.0671 - classification_loss: 0.1772 297/500 [================>.............] 
- ETA: 1:07 - loss: 1.2444 - regression_loss: 1.0671 - classification_loss: 0.1772 298/500 [================>.............] - ETA: 1:06 - loss: 1.2465 - regression_loss: 1.0690 - classification_loss: 0.1775 299/500 [================>.............] - ETA: 1:06 - loss: 1.2453 - regression_loss: 1.0680 - classification_loss: 0.1773 300/500 [=================>............] - ETA: 1:06 - loss: 1.2465 - regression_loss: 1.0689 - classification_loss: 0.1777 301/500 [=================>............] - ETA: 1:06 - loss: 1.2439 - regression_loss: 1.0665 - classification_loss: 0.1774 302/500 [=================>............] - ETA: 1:05 - loss: 1.2428 - regression_loss: 1.0656 - classification_loss: 0.1771 303/500 [=================>............] - ETA: 1:05 - loss: 1.2427 - regression_loss: 1.0657 - classification_loss: 0.1770 304/500 [=================>............] - ETA: 1:05 - loss: 1.2406 - regression_loss: 1.0639 - classification_loss: 0.1767 305/500 [=================>............] - ETA: 1:04 - loss: 1.2401 - regression_loss: 1.0634 - classification_loss: 0.1767 306/500 [=================>............] - ETA: 1:04 - loss: 1.2379 - regression_loss: 1.0616 - classification_loss: 0.1763 307/500 [=================>............] - ETA: 1:04 - loss: 1.2359 - regression_loss: 1.0598 - classification_loss: 0.1761 308/500 [=================>............] - ETA: 1:03 - loss: 1.2365 - regression_loss: 1.0603 - classification_loss: 0.1761 309/500 [=================>............] - ETA: 1:03 - loss: 1.2369 - regression_loss: 1.0608 - classification_loss: 0.1762 310/500 [=================>............] - ETA: 1:03 - loss: 1.2378 - regression_loss: 1.0615 - classification_loss: 0.1763 311/500 [=================>............] - ETA: 1:02 - loss: 1.2361 - regression_loss: 1.0602 - classification_loss: 0.1760 312/500 [=================>............] - ETA: 1:02 - loss: 1.2338 - regression_loss: 1.0583 - classification_loss: 0.1756 313/500 [=================>............] 
- ETA: 1:02 - loss: 1.2330 - regression_loss: 1.0575 - classification_loss: 0.1754 314/500 [=================>............] - ETA: 1:01 - loss: 1.2309 - regression_loss: 1.0557 - classification_loss: 0.1752 315/500 [=================>............] - ETA: 1:01 - loss: 1.2299 - regression_loss: 1.0548 - classification_loss: 0.1751 316/500 [=================>............] - ETA: 1:01 - loss: 1.2306 - regression_loss: 1.0554 - classification_loss: 0.1752 317/500 [==================>...........] - ETA: 1:00 - loss: 1.2323 - regression_loss: 1.0570 - classification_loss: 0.1753 318/500 [==================>...........] - ETA: 1:00 - loss: 1.2334 - regression_loss: 1.0579 - classification_loss: 0.1755 319/500 [==================>...........] - ETA: 1:00 - loss: 1.2323 - regression_loss: 1.0571 - classification_loss: 0.1752 320/500 [==================>...........] - ETA: 59s - loss: 1.2327 - regression_loss: 1.0575 - classification_loss: 0.1751  321/500 [==================>...........] - ETA: 59s - loss: 1.2345 - regression_loss: 1.0590 - classification_loss: 0.1755 322/500 [==================>...........] - ETA: 59s - loss: 1.2348 - regression_loss: 1.0592 - classification_loss: 0.1755 323/500 [==================>...........] - ETA: 58s - loss: 1.2342 - regression_loss: 1.0589 - classification_loss: 0.1754 324/500 [==================>...........] - ETA: 58s - loss: 1.2354 - regression_loss: 1.0600 - classification_loss: 0.1755 325/500 [==================>...........] - ETA: 58s - loss: 1.2340 - regression_loss: 1.0588 - classification_loss: 0.1752 326/500 [==================>...........] - ETA: 57s - loss: 1.2340 - regression_loss: 1.0588 - classification_loss: 0.1752 327/500 [==================>...........] - ETA: 57s - loss: 1.2317 - regression_loss: 1.0567 - classification_loss: 0.1750 328/500 [==================>...........] - ETA: 57s - loss: 1.2313 - regression_loss: 1.0564 - classification_loss: 0.1749 329/500 [==================>...........] 
500/500 [==============================] - 166s 331ms/step - loss: 1.2156 - regression_loss: 1.0416 - classification_loss: 0.1740 
1172 instances of class plum with average precision: 0.5745 
mAP: 0.5745 
Epoch 00023: saving model to ./training/snapshots/resnet101_pascal_23.h5 
Epoch 24/150 
163/500 [========>.....................] - ETA: 1:51 - loss: 1.2252 - regression_loss: 1.0477 - classification_loss: 0.1775 
- ETA: 1:51 - loss: 1.2219 - regression_loss: 1.0451 - classification_loss: 0.1769 165/500 [========>.....................] - ETA: 1:50 - loss: 1.2192 - regression_loss: 1.0429 - classification_loss: 0.1763 166/500 [========>.....................] - ETA: 1:50 - loss: 1.2184 - regression_loss: 1.0422 - classification_loss: 0.1762 167/500 [=========>....................] - ETA: 1:50 - loss: 1.2234 - regression_loss: 1.0464 - classification_loss: 0.1770 168/500 [=========>....................] - ETA: 1:49 - loss: 1.2194 - regression_loss: 1.0431 - classification_loss: 0.1763 169/500 [=========>....................] - ETA: 1:49 - loss: 1.2204 - regression_loss: 1.0440 - classification_loss: 0.1764 170/500 [=========>....................] - ETA: 1:49 - loss: 1.2156 - regression_loss: 1.0399 - classification_loss: 0.1757 171/500 [=========>....................] - ETA: 1:48 - loss: 1.2152 - regression_loss: 1.0393 - classification_loss: 0.1759 172/500 [=========>....................] - ETA: 1:48 - loss: 1.2147 - regression_loss: 1.0390 - classification_loss: 0.1758 173/500 [=========>....................] - ETA: 1:48 - loss: 1.2168 - regression_loss: 1.0409 - classification_loss: 0.1758 174/500 [=========>....................] - ETA: 1:47 - loss: 1.2147 - regression_loss: 1.0392 - classification_loss: 0.1754 175/500 [=========>....................] - ETA: 1:47 - loss: 1.2177 - regression_loss: 1.0416 - classification_loss: 0.1762 176/500 [=========>....................] - ETA: 1:47 - loss: 1.2160 - regression_loss: 1.0401 - classification_loss: 0.1760 177/500 [=========>....................] - ETA: 1:46 - loss: 1.2179 - regression_loss: 1.0415 - classification_loss: 0.1764 178/500 [=========>....................] - ETA: 1:46 - loss: 1.2175 - regression_loss: 1.0412 - classification_loss: 0.1762 179/500 [=========>....................] - ETA: 1:46 - loss: 1.2201 - regression_loss: 1.0431 - classification_loss: 0.1769 180/500 [=========>....................] 
- ETA: 1:45 - loss: 1.2205 - regression_loss: 1.0437 - classification_loss: 0.1767 181/500 [=========>....................] - ETA: 1:45 - loss: 1.2163 - regression_loss: 1.0402 - classification_loss: 0.1761 182/500 [=========>....................] - ETA: 1:44 - loss: 1.2200 - regression_loss: 1.0432 - classification_loss: 0.1768 183/500 [=========>....................] - ETA: 1:44 - loss: 1.2199 - regression_loss: 1.0437 - classification_loss: 0.1762 184/500 [==========>...................] - ETA: 1:44 - loss: 1.2209 - regression_loss: 1.0446 - classification_loss: 0.1763 185/500 [==========>...................] - ETA: 1:43 - loss: 1.2208 - regression_loss: 1.0445 - classification_loss: 0.1763 186/500 [==========>...................] - ETA: 1:43 - loss: 1.2196 - regression_loss: 1.0437 - classification_loss: 0.1759 187/500 [==========>...................] - ETA: 1:43 - loss: 1.2179 - regression_loss: 1.0425 - classification_loss: 0.1755 188/500 [==========>...................] - ETA: 1:42 - loss: 1.2188 - regression_loss: 1.0434 - classification_loss: 0.1754 189/500 [==========>...................] - ETA: 1:42 - loss: 1.2177 - regression_loss: 1.0426 - classification_loss: 0.1751 190/500 [==========>...................] - ETA: 1:42 - loss: 1.2193 - regression_loss: 1.0441 - classification_loss: 0.1752 191/500 [==========>...................] - ETA: 1:41 - loss: 1.2189 - regression_loss: 1.0438 - classification_loss: 0.1751 192/500 [==========>...................] - ETA: 1:41 - loss: 1.2188 - regression_loss: 1.0436 - classification_loss: 0.1752 193/500 [==========>...................] - ETA: 1:41 - loss: 1.2162 - regression_loss: 1.0414 - classification_loss: 0.1747 194/500 [==========>...................] - ETA: 1:40 - loss: 1.2179 - regression_loss: 1.0430 - classification_loss: 0.1750 195/500 [==========>...................] - ETA: 1:40 - loss: 1.2179 - regression_loss: 1.0429 - classification_loss: 0.1749 196/500 [==========>...................] 
- ETA: 1:40 - loss: 1.2148 - regression_loss: 1.0403 - classification_loss: 0.1746 197/500 [==========>...................] - ETA: 1:39 - loss: 1.2119 - regression_loss: 1.0377 - classification_loss: 0.1741 198/500 [==========>...................] - ETA: 1:39 - loss: 1.2110 - regression_loss: 1.0368 - classification_loss: 0.1742 199/500 [==========>...................] - ETA: 1:39 - loss: 1.2113 - regression_loss: 1.0372 - classification_loss: 0.1742 200/500 [===========>..................] - ETA: 1:38 - loss: 1.2148 - regression_loss: 1.0401 - classification_loss: 0.1747 201/500 [===========>..................] - ETA: 1:38 - loss: 1.2132 - regression_loss: 1.0387 - classification_loss: 0.1745 202/500 [===========>..................] - ETA: 1:38 - loss: 1.2123 - regression_loss: 1.0378 - classification_loss: 0.1744 203/500 [===========>..................] - ETA: 1:37 - loss: 1.2132 - regression_loss: 1.0388 - classification_loss: 0.1744 204/500 [===========>..................] - ETA: 1:37 - loss: 1.2136 - regression_loss: 1.0391 - classification_loss: 0.1745 205/500 [===========>..................] - ETA: 1:37 - loss: 1.2139 - regression_loss: 1.0391 - classification_loss: 0.1748 206/500 [===========>..................] - ETA: 1:37 - loss: 1.2124 - regression_loss: 1.0380 - classification_loss: 0.1744 207/500 [===========>..................] - ETA: 1:36 - loss: 1.2147 - regression_loss: 1.0398 - classification_loss: 0.1749 208/500 [===========>..................] - ETA: 1:36 - loss: 1.2160 - regression_loss: 1.0410 - classification_loss: 0.1750 209/500 [===========>..................] - ETA: 1:36 - loss: 1.2172 - regression_loss: 1.0425 - classification_loss: 0.1747 210/500 [===========>..................] - ETA: 1:35 - loss: 1.2177 - regression_loss: 1.0429 - classification_loss: 0.1748 211/500 [===========>..................] - ETA: 1:35 - loss: 1.2187 - regression_loss: 1.0438 - classification_loss: 0.1749 212/500 [===========>..................] 
- ETA: 1:35 - loss: 1.2172 - regression_loss: 1.0425 - classification_loss: 0.1747 213/500 [===========>..................] - ETA: 1:34 - loss: 1.2161 - regression_loss: 1.0419 - classification_loss: 0.1743 214/500 [===========>..................] - ETA: 1:34 - loss: 1.2181 - regression_loss: 1.0435 - classification_loss: 0.1746 215/500 [===========>..................] - ETA: 1:33 - loss: 1.2197 - regression_loss: 1.0449 - classification_loss: 0.1748 216/500 [===========>..................] - ETA: 1:33 - loss: 1.2160 - regression_loss: 1.0417 - classification_loss: 0.1743 217/500 [============>.................] - ETA: 1:33 - loss: 1.2165 - regression_loss: 1.0421 - classification_loss: 0.1744 218/500 [============>.................] - ETA: 1:32 - loss: 1.2169 - regression_loss: 1.0428 - classification_loss: 0.1741 219/500 [============>.................] - ETA: 1:32 - loss: 1.2186 - regression_loss: 1.0442 - classification_loss: 0.1743 220/500 [============>.................] - ETA: 1:32 - loss: 1.2203 - regression_loss: 1.0457 - classification_loss: 0.1746 221/500 [============>.................] - ETA: 1:31 - loss: 1.2164 - regression_loss: 1.0423 - classification_loss: 0.1741 222/500 [============>.................] - ETA: 1:31 - loss: 1.2179 - regression_loss: 1.0436 - classification_loss: 0.1743 223/500 [============>.................] - ETA: 1:31 - loss: 1.2182 - regression_loss: 1.0440 - classification_loss: 0.1742 224/500 [============>.................] - ETA: 1:30 - loss: 1.2218 - regression_loss: 1.0468 - classification_loss: 0.1750 225/500 [============>.................] - ETA: 1:30 - loss: 1.2212 - regression_loss: 1.0463 - classification_loss: 0.1749 226/500 [============>.................] - ETA: 1:30 - loss: 1.2201 - regression_loss: 1.0456 - classification_loss: 0.1745 227/500 [============>.................] - ETA: 1:30 - loss: 1.2182 - regression_loss: 1.0436 - classification_loss: 0.1746 228/500 [============>.................] 
- ETA: 1:29 - loss: 1.2183 - regression_loss: 1.0436 - classification_loss: 0.1747 229/500 [============>.................] - ETA: 1:29 - loss: 1.2170 - regression_loss: 1.0425 - classification_loss: 0.1745 230/500 [============>.................] - ETA: 1:29 - loss: 1.2180 - regression_loss: 1.0432 - classification_loss: 0.1748 231/500 [============>.................] - ETA: 1:28 - loss: 1.2196 - regression_loss: 1.0444 - classification_loss: 0.1752 232/500 [============>.................] - ETA: 1:28 - loss: 1.2200 - regression_loss: 1.0448 - classification_loss: 0.1752 233/500 [============>.................] - ETA: 1:28 - loss: 1.2202 - regression_loss: 1.0449 - classification_loss: 0.1753 234/500 [=============>................] - ETA: 1:27 - loss: 1.2183 - regression_loss: 1.0430 - classification_loss: 0.1753 235/500 [=============>................] - ETA: 1:27 - loss: 1.2171 - regression_loss: 1.0421 - classification_loss: 0.1749 236/500 [=============>................] - ETA: 1:26 - loss: 1.2180 - regression_loss: 1.0429 - classification_loss: 0.1750 237/500 [=============>................] - ETA: 1:26 - loss: 1.2183 - regression_loss: 1.0434 - classification_loss: 0.1749 238/500 [=============>................] - ETA: 1:26 - loss: 1.2190 - regression_loss: 1.0440 - classification_loss: 0.1750 239/500 [=============>................] - ETA: 1:25 - loss: 1.2201 - regression_loss: 1.0447 - classification_loss: 0.1754 240/500 [=============>................] - ETA: 1:25 - loss: 1.2177 - regression_loss: 1.0429 - classification_loss: 0.1748 241/500 [=============>................] - ETA: 1:25 - loss: 1.2169 - regression_loss: 1.0425 - classification_loss: 0.1744 242/500 [=============>................] - ETA: 1:24 - loss: 1.2143 - regression_loss: 1.0403 - classification_loss: 0.1739 243/500 [=============>................] - ETA: 1:24 - loss: 1.2124 - regression_loss: 1.0388 - classification_loss: 0.1735 244/500 [=============>................] 
- ETA: 1:24 - loss: 1.2095 - regression_loss: 1.0364 - classification_loss: 0.1731 245/500 [=============>................] - ETA: 1:23 - loss: 1.2074 - regression_loss: 1.0346 - classification_loss: 0.1728 246/500 [=============>................] - ETA: 1:23 - loss: 1.2094 - regression_loss: 1.0364 - classification_loss: 0.1730 247/500 [=============>................] - ETA: 1:23 - loss: 1.2095 - regression_loss: 1.0364 - classification_loss: 0.1731 248/500 [=============>................] - ETA: 1:22 - loss: 1.2121 - regression_loss: 1.0385 - classification_loss: 0.1736 249/500 [=============>................] - ETA: 1:22 - loss: 1.2086 - regression_loss: 1.0354 - classification_loss: 0.1732 250/500 [==============>...............] - ETA: 1:22 - loss: 1.2091 - regression_loss: 1.0359 - classification_loss: 0.1732 251/500 [==============>...............] - ETA: 1:22 - loss: 1.2096 - regression_loss: 1.0363 - classification_loss: 0.1733 252/500 [==============>...............] - ETA: 1:21 - loss: 1.2081 - regression_loss: 1.0350 - classification_loss: 0.1731 253/500 [==============>...............] - ETA: 1:21 - loss: 1.2090 - regression_loss: 1.0359 - classification_loss: 0.1732 254/500 [==============>...............] - ETA: 1:21 - loss: 1.2101 - regression_loss: 1.0369 - classification_loss: 0.1732 255/500 [==============>...............] - ETA: 1:20 - loss: 1.2110 - regression_loss: 1.0377 - classification_loss: 0.1733 256/500 [==============>...............] - ETA: 1:20 - loss: 1.2139 - regression_loss: 1.0399 - classification_loss: 0.1740 257/500 [==============>...............] - ETA: 1:20 - loss: 1.2158 - regression_loss: 1.0414 - classification_loss: 0.1744 258/500 [==============>...............] - ETA: 1:19 - loss: 1.2170 - regression_loss: 1.0425 - classification_loss: 0.1745 259/500 [==============>...............] - ETA: 1:19 - loss: 1.2142 - regression_loss: 1.0400 - classification_loss: 0.1742 260/500 [==============>...............] 
- ETA: 1:18 - loss: 1.2124 - regression_loss: 1.0385 - classification_loss: 0.1739 261/500 [==============>...............] - ETA: 1:18 - loss: 1.2094 - regression_loss: 1.0359 - classification_loss: 0.1734 262/500 [==============>...............] - ETA: 1:18 - loss: 1.2086 - regression_loss: 1.0354 - classification_loss: 0.1732 263/500 [==============>...............] - ETA: 1:18 - loss: 1.2088 - regression_loss: 1.0358 - classification_loss: 0.1730 264/500 [==============>...............] - ETA: 1:17 - loss: 1.2093 - regression_loss: 1.0361 - classification_loss: 0.1732 265/500 [==============>...............] - ETA: 1:17 - loss: 1.2120 - regression_loss: 1.0383 - classification_loss: 0.1737 266/500 [==============>...............] - ETA: 1:17 - loss: 1.2114 - regression_loss: 1.0377 - classification_loss: 0.1737 267/500 [===============>..............] - ETA: 1:16 - loss: 1.2093 - regression_loss: 1.0359 - classification_loss: 0.1734 268/500 [===============>..............] - ETA: 1:16 - loss: 1.2100 - regression_loss: 1.0366 - classification_loss: 0.1735 269/500 [===============>..............] - ETA: 1:16 - loss: 1.2111 - regression_loss: 1.0375 - classification_loss: 0.1736 270/500 [===============>..............] - ETA: 1:15 - loss: 1.2117 - regression_loss: 1.0382 - classification_loss: 0.1735 271/500 [===============>..............] - ETA: 1:15 - loss: 1.2109 - regression_loss: 1.0376 - classification_loss: 0.1733 272/500 [===============>..............] - ETA: 1:15 - loss: 1.2088 - regression_loss: 1.0358 - classification_loss: 0.1731 273/500 [===============>..............] - ETA: 1:14 - loss: 1.2073 - regression_loss: 1.0345 - classification_loss: 0.1728 274/500 [===============>..............] - ETA: 1:14 - loss: 1.2075 - regression_loss: 1.0345 - classification_loss: 0.1730 275/500 [===============>..............] - ETA: 1:14 - loss: 1.2077 - regression_loss: 1.0346 - classification_loss: 0.1731 276/500 [===============>..............] 
- ETA: 1:13 - loss: 1.2096 - regression_loss: 1.0363 - classification_loss: 0.1733 277/500 [===============>..............] - ETA: 1:13 - loss: 1.2116 - regression_loss: 1.0379 - classification_loss: 0.1737 278/500 [===============>..............] - ETA: 1:13 - loss: 1.2101 - regression_loss: 1.0366 - classification_loss: 0.1735 279/500 [===============>..............] - ETA: 1:12 - loss: 1.2092 - regression_loss: 1.0358 - classification_loss: 0.1734 280/500 [===============>..............] - ETA: 1:12 - loss: 1.2093 - regression_loss: 1.0359 - classification_loss: 0.1735 281/500 [===============>..............] - ETA: 1:12 - loss: 1.2100 - regression_loss: 1.0362 - classification_loss: 0.1737 282/500 [===============>..............] - ETA: 1:11 - loss: 1.2096 - regression_loss: 1.0358 - classification_loss: 0.1738 283/500 [===============>..............] - ETA: 1:11 - loss: 1.2104 - regression_loss: 1.0364 - classification_loss: 0.1740 284/500 [================>.............] - ETA: 1:11 - loss: 1.2098 - regression_loss: 1.0359 - classification_loss: 0.1739 285/500 [================>.............] - ETA: 1:10 - loss: 1.2104 - regression_loss: 1.0364 - classification_loss: 0.1741 286/500 [================>.............] - ETA: 1:10 - loss: 1.2093 - regression_loss: 1.0355 - classification_loss: 0.1738 287/500 [================>.............] - ETA: 1:10 - loss: 1.2096 - regression_loss: 1.0358 - classification_loss: 0.1738 288/500 [================>.............] - ETA: 1:09 - loss: 1.2075 - regression_loss: 1.0339 - classification_loss: 0.1736 289/500 [================>.............] - ETA: 1:09 - loss: 1.2084 - regression_loss: 1.0346 - classification_loss: 0.1737 290/500 [================>.............] - ETA: 1:09 - loss: 1.2079 - regression_loss: 1.0344 - classification_loss: 0.1736 291/500 [================>.............] - ETA: 1:08 - loss: 1.2060 - regression_loss: 1.0327 - classification_loss: 0.1732 292/500 [================>.............] 
- ETA: 1:08 - loss: 1.2060 - regression_loss: 1.0328 - classification_loss: 0.1733 293/500 [================>.............] - ETA: 1:08 - loss: 1.2052 - regression_loss: 1.0320 - classification_loss: 0.1732 294/500 [================>.............] - ETA: 1:07 - loss: 1.2060 - regression_loss: 1.0327 - classification_loss: 0.1733 295/500 [================>.............] - ETA: 1:07 - loss: 1.2032 - regression_loss: 1.0303 - classification_loss: 0.1729 296/500 [================>.............] - ETA: 1:07 - loss: 1.2008 - regression_loss: 1.0283 - classification_loss: 0.1726 297/500 [================>.............] - ETA: 1:06 - loss: 1.2009 - regression_loss: 1.0284 - classification_loss: 0.1725 298/500 [================>.............] - ETA: 1:06 - loss: 1.2004 - regression_loss: 1.0280 - classification_loss: 0.1724 299/500 [================>.............] - ETA: 1:06 - loss: 1.2010 - regression_loss: 1.0288 - classification_loss: 0.1722 300/500 [=================>............] - ETA: 1:05 - loss: 1.2025 - regression_loss: 1.0299 - classification_loss: 0.1726 301/500 [=================>............] - ETA: 1:05 - loss: 1.2005 - regression_loss: 1.0282 - classification_loss: 0.1723 302/500 [=================>............] - ETA: 1:05 - loss: 1.1976 - regression_loss: 1.0256 - classification_loss: 0.1720 303/500 [=================>............] - ETA: 1:04 - loss: 1.1963 - regression_loss: 1.0245 - classification_loss: 0.1718 304/500 [=================>............] - ETA: 1:04 - loss: 1.1951 - regression_loss: 1.0233 - classification_loss: 0.1718 305/500 [=================>............] - ETA: 1:04 - loss: 1.1953 - regression_loss: 1.0237 - classification_loss: 0.1717 306/500 [=================>............] - ETA: 1:03 - loss: 1.1947 - regression_loss: 1.0232 - classification_loss: 0.1716 307/500 [=================>............] - ETA: 1:03 - loss: 1.1930 - regression_loss: 1.0215 - classification_loss: 0.1715 308/500 [=================>............] 
- ETA: 1:03 - loss: 1.1944 - regression_loss: 1.0225 - classification_loss: 0.1719 309/500 [=================>............] - ETA: 1:02 - loss: 1.1945 - regression_loss: 1.0226 - classification_loss: 0.1718 310/500 [=================>............] - ETA: 1:02 - loss: 1.1942 - regression_loss: 1.0222 - classification_loss: 0.1719 311/500 [=================>............] - ETA: 1:02 - loss: 1.1949 - regression_loss: 1.0227 - classification_loss: 0.1722 312/500 [=================>............] - ETA: 1:02 - loss: 1.1958 - regression_loss: 1.0235 - classification_loss: 0.1723 313/500 [=================>............] - ETA: 1:01 - loss: 1.1973 - regression_loss: 1.0247 - classification_loss: 0.1726 314/500 [=================>............] - ETA: 1:01 - loss: 1.1968 - regression_loss: 1.0244 - classification_loss: 0.1724 315/500 [=================>............] - ETA: 1:00 - loss: 1.1988 - regression_loss: 1.0261 - classification_loss: 0.1727 316/500 [=================>............] - ETA: 1:00 - loss: 1.1975 - regression_loss: 1.0249 - classification_loss: 0.1726 317/500 [==================>...........] - ETA: 1:00 - loss: 1.1994 - regression_loss: 1.0264 - classification_loss: 0.1729 318/500 [==================>...........] - ETA: 1:00 - loss: 1.1978 - regression_loss: 1.0251 - classification_loss: 0.1727 319/500 [==================>...........] - ETA: 59s - loss: 1.1960 - regression_loss: 1.0235 - classification_loss: 0.1725  320/500 [==================>...........] - ETA: 59s - loss: 1.1962 - regression_loss: 1.0237 - classification_loss: 0.1726 321/500 [==================>...........] - ETA: 59s - loss: 1.1967 - regression_loss: 1.0242 - classification_loss: 0.1725 322/500 [==================>...........] - ETA: 58s - loss: 1.1973 - regression_loss: 1.0248 - classification_loss: 0.1725 323/500 [==================>...........] - ETA: 58s - loss: 1.1977 - regression_loss: 1.0252 - classification_loss: 0.1725 324/500 [==================>...........] 
- ETA: 58s - loss: 1.1961 - regression_loss: 1.0238 - classification_loss: 0.1723 325/500 [==================>...........] - ETA: 57s - loss: 1.1950 - regression_loss: 1.0229 - classification_loss: 0.1721 326/500 [==================>...........] - ETA: 57s - loss: 1.1957 - regression_loss: 1.0235 - classification_loss: 0.1722 327/500 [==================>...........] - ETA: 57s - loss: 1.1938 - regression_loss: 1.0219 - classification_loss: 0.1719 328/500 [==================>...........] - ETA: 56s - loss: 1.1931 - regression_loss: 1.0213 - classification_loss: 0.1718 329/500 [==================>...........] - ETA: 56s - loss: 1.1945 - regression_loss: 1.0224 - classification_loss: 0.1720 330/500 [==================>...........] - ETA: 56s - loss: 1.1942 - regression_loss: 1.0224 - classification_loss: 0.1718 331/500 [==================>...........] - ETA: 55s - loss: 1.1916 - regression_loss: 1.0202 - classification_loss: 0.1715 332/500 [==================>...........] - ETA: 55s - loss: 1.1921 - regression_loss: 1.0206 - classification_loss: 0.1715 333/500 [==================>...........] - ETA: 55s - loss: 1.1930 - regression_loss: 1.0213 - classification_loss: 0.1717 334/500 [===================>..........] - ETA: 54s - loss: 1.1925 - regression_loss: 1.0211 - classification_loss: 0.1715 335/500 [===================>..........] - ETA: 54s - loss: 1.1924 - regression_loss: 1.0209 - classification_loss: 0.1714 336/500 [===================>..........] - ETA: 54s - loss: 1.1932 - regression_loss: 1.0216 - classification_loss: 0.1716 337/500 [===================>..........] - ETA: 53s - loss: 1.1949 - regression_loss: 1.0230 - classification_loss: 0.1719 338/500 [===================>..........] - ETA: 53s - loss: 1.1967 - regression_loss: 1.0245 - classification_loss: 0.1722 339/500 [===================>..........] - ETA: 53s - loss: 1.1972 - regression_loss: 1.0249 - classification_loss: 0.1723 340/500 [===================>..........] 
- ETA: 52s - loss: 1.1991 - regression_loss: 1.0264 - classification_loss: 0.1727 341/500 [===================>..........] - ETA: 52s - loss: 1.2003 - regression_loss: 1.0275 - classification_loss: 0.1729 342/500 [===================>..........] - ETA: 52s - loss: 1.2004 - regression_loss: 1.0277 - classification_loss: 0.1727 343/500 [===================>..........] - ETA: 51s - loss: 1.1987 - regression_loss: 1.0263 - classification_loss: 0.1724 344/500 [===================>..........] - ETA: 51s - loss: 1.1998 - regression_loss: 1.0273 - classification_loss: 0.1725 345/500 [===================>..........] - ETA: 51s - loss: 1.1996 - regression_loss: 1.0272 - classification_loss: 0.1724 346/500 [===================>..........] - ETA: 50s - loss: 1.1980 - regression_loss: 1.0258 - classification_loss: 0.1722 347/500 [===================>..........] - ETA: 50s - loss: 1.1980 - regression_loss: 1.0257 - classification_loss: 0.1723 348/500 [===================>..........] - ETA: 50s - loss: 1.1962 - regression_loss: 1.0242 - classification_loss: 0.1720 349/500 [===================>..........] - ETA: 49s - loss: 1.1971 - regression_loss: 1.0250 - classification_loss: 0.1722 350/500 [====================>.........] - ETA: 49s - loss: 1.1953 - regression_loss: 1.0232 - classification_loss: 0.1721 351/500 [====================>.........] - ETA: 49s - loss: 1.1970 - regression_loss: 1.0246 - classification_loss: 0.1724 352/500 [====================>.........] - ETA: 48s - loss: 1.1973 - regression_loss: 1.0248 - classification_loss: 0.1724 353/500 [====================>.........] - ETA: 48s - loss: 1.1961 - regression_loss: 1.0239 - classification_loss: 0.1721 354/500 [====================>.........] - ETA: 48s - loss: 1.1976 - regression_loss: 1.0252 - classification_loss: 0.1724 355/500 [====================>.........] - ETA: 47s - loss: 1.1961 - regression_loss: 1.0239 - classification_loss: 0.1722 356/500 [====================>.........] 
- ETA: 47s - loss: 1.1980 - regression_loss: 1.0255 - classification_loss: 0.1725 357/500 [====================>.........] - ETA: 47s - loss: 1.1980 - regression_loss: 1.0254 - classification_loss: 0.1726 358/500 [====================>.........] - ETA: 46s - loss: 1.1978 - regression_loss: 1.0251 - classification_loss: 0.1727 359/500 [====================>.........] - ETA: 46s - loss: 1.1979 - regression_loss: 1.0255 - classification_loss: 0.1724 360/500 [====================>.........] - ETA: 46s - loss: 1.1988 - regression_loss: 1.0262 - classification_loss: 0.1726 361/500 [====================>.........] - ETA: 45s - loss: 1.1998 - regression_loss: 1.0270 - classification_loss: 0.1728 362/500 [====================>.........] - ETA: 45s - loss: 1.1981 - regression_loss: 1.0255 - classification_loss: 0.1726 363/500 [====================>.........] - ETA: 45s - loss: 1.1975 - regression_loss: 1.0251 - classification_loss: 0.1724 364/500 [====================>.........] - ETA: 44s - loss: 1.1976 - regression_loss: 1.0252 - classification_loss: 0.1724 365/500 [====================>.........] - ETA: 44s - loss: 1.1964 - regression_loss: 1.0241 - classification_loss: 0.1723 366/500 [====================>.........] - ETA: 44s - loss: 1.1964 - regression_loss: 1.0242 - classification_loss: 0.1722 367/500 [=====================>........] - ETA: 43s - loss: 1.1959 - regression_loss: 1.0231 - classification_loss: 0.1728 368/500 [=====================>........] - ETA: 43s - loss: 1.1941 - regression_loss: 1.0215 - classification_loss: 0.1726 369/500 [=====================>........] - ETA: 43s - loss: 1.1959 - regression_loss: 1.0230 - classification_loss: 0.1729 370/500 [=====================>........] - ETA: 42s - loss: 1.1970 - regression_loss: 1.0238 - classification_loss: 0.1732 371/500 [=====================>........] - ETA: 42s - loss: 1.1951 - regression_loss: 1.0222 - classification_loss: 0.1729 372/500 [=====================>........] 
500/500 [==============================] - 165s 331ms/step - loss: 1.1792 - regression_loss: 1.0080 - classification_loss: 0.1712
1172 instances of class plum with average precision: 0.6002
mAP: 0.6002
Epoch 00024: saving model to ./training/snapshots/resnet101_pascal_24.h5
Epoch 25/150
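As a sanity check on the numbers reported above: in this RetinaNet setup the total `loss` printed by the Keras progress bar appears to be the sum of the two head losses, `regression_loss` and `classification_loss`. A minimal sketch verifying this against the epoch-24 summary line (values taken from the log above):

```python
# Epoch-24 final metrics as reported by the progress bar.
regression_loss = 1.0080      # box regression head (smooth L1 in RetinaNet)
classification_loss = 0.1712  # classification head (focal loss in RetinaNet)
reported_total = 1.1792

# The reported total matches the sum of the two components.
total = regression_loss + classification_loss
assert abs(total - reported_total) < 1e-6
print(f"loss = {total:.4f}")
```

With a single class (plum) in the dataset, the reported mAP of 0.6002 is simply that one class's average precision.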
- ETA: 1:36 - loss: 1.1594 - regression_loss: 0.9934 - classification_loss: 0.1661 207/500 [===========>..................] - ETA: 1:36 - loss: 1.1585 - regression_loss: 0.9924 - classification_loss: 0.1661 208/500 [===========>..................] - ETA: 1:36 - loss: 1.1561 - regression_loss: 0.9904 - classification_loss: 0.1656 209/500 [===========>..................] - ETA: 1:35 - loss: 1.1546 - regression_loss: 0.9892 - classification_loss: 0.1654 210/500 [===========>..................] - ETA: 1:35 - loss: 1.1531 - regression_loss: 0.9880 - classification_loss: 0.1651 211/500 [===========>..................] - ETA: 1:35 - loss: 1.1556 - regression_loss: 0.9902 - classification_loss: 0.1654 212/500 [===========>..................] - ETA: 1:34 - loss: 1.1521 - regression_loss: 0.9871 - classification_loss: 0.1650 213/500 [===========>..................] - ETA: 1:34 - loss: 1.1527 - regression_loss: 0.9875 - classification_loss: 0.1652 214/500 [===========>..................] - ETA: 1:34 - loss: 1.1547 - regression_loss: 0.9891 - classification_loss: 0.1656 215/500 [===========>..................] - ETA: 1:33 - loss: 1.1540 - regression_loss: 0.9884 - classification_loss: 0.1657 216/500 [===========>..................] - ETA: 1:33 - loss: 1.1532 - regression_loss: 0.9876 - classification_loss: 0.1656 217/500 [============>.................] - ETA: 1:33 - loss: 1.1528 - regression_loss: 0.9873 - classification_loss: 0.1655 218/500 [============>.................] - ETA: 1:32 - loss: 1.1516 - regression_loss: 0.9865 - classification_loss: 0.1651 219/500 [============>.................] - ETA: 1:32 - loss: 1.1544 - regression_loss: 0.9886 - classification_loss: 0.1658 220/500 [============>.................] - ETA: 1:32 - loss: 1.1578 - regression_loss: 0.9917 - classification_loss: 0.1661 221/500 [============>.................] - ETA: 1:31 - loss: 1.1567 - regression_loss: 0.9909 - classification_loss: 0.1658 222/500 [============>.................] 
- ETA: 1:31 - loss: 1.1556 - regression_loss: 0.9901 - classification_loss: 0.1655 223/500 [============>.................] - ETA: 1:31 - loss: 1.1521 - regression_loss: 0.9871 - classification_loss: 0.1650 224/500 [============>.................] - ETA: 1:30 - loss: 1.1539 - regression_loss: 0.9888 - classification_loss: 0.1651 225/500 [============>.................] - ETA: 1:30 - loss: 1.1555 - regression_loss: 0.9903 - classification_loss: 0.1652 226/500 [============>.................] - ETA: 1:30 - loss: 1.1542 - regression_loss: 0.9894 - classification_loss: 0.1648 227/500 [============>.................] - ETA: 1:30 - loss: 1.1528 - regression_loss: 0.9882 - classification_loss: 0.1646 228/500 [============>.................] - ETA: 1:29 - loss: 1.1537 - regression_loss: 0.9892 - classification_loss: 0.1645 229/500 [============>.................] - ETA: 1:29 - loss: 1.1563 - regression_loss: 0.9914 - classification_loss: 0.1649 230/500 [============>.................] - ETA: 1:29 - loss: 1.1566 - regression_loss: 0.9917 - classification_loss: 0.1649 231/500 [============>.................] - ETA: 1:28 - loss: 1.1550 - regression_loss: 0.9903 - classification_loss: 0.1647 232/500 [============>.................] - ETA: 1:28 - loss: 1.1533 - regression_loss: 0.9890 - classification_loss: 0.1643 233/500 [============>.................] - ETA: 1:28 - loss: 1.1552 - regression_loss: 0.9906 - classification_loss: 0.1646 234/500 [=============>................] - ETA: 1:27 - loss: 1.1528 - regression_loss: 0.9885 - classification_loss: 0.1642 235/500 [=============>................] - ETA: 1:27 - loss: 1.1522 - regression_loss: 0.9882 - classification_loss: 0.1640 236/500 [=============>................] - ETA: 1:27 - loss: 1.1551 - regression_loss: 0.9906 - classification_loss: 0.1646 237/500 [=============>................] - ETA: 1:26 - loss: 1.1571 - regression_loss: 0.9923 - classification_loss: 0.1647 238/500 [=============>................] 
- ETA: 1:26 - loss: 1.1575 - regression_loss: 0.9928 - classification_loss: 0.1648 239/500 [=============>................] - ETA: 1:26 - loss: 1.1546 - regression_loss: 0.9904 - classification_loss: 0.1643 240/500 [=============>................] - ETA: 1:25 - loss: 1.1567 - regression_loss: 0.9919 - classification_loss: 0.1648 241/500 [=============>................] - ETA: 1:25 - loss: 1.1578 - regression_loss: 0.9928 - classification_loss: 0.1650 242/500 [=============>................] - ETA: 1:25 - loss: 1.1599 - regression_loss: 0.9947 - classification_loss: 0.1652 243/500 [=============>................] - ETA: 1:24 - loss: 1.1633 - regression_loss: 0.9975 - classification_loss: 0.1658 244/500 [=============>................] - ETA: 1:24 - loss: 1.1662 - regression_loss: 0.9999 - classification_loss: 0.1662 245/500 [=============>................] - ETA: 1:24 - loss: 1.1656 - regression_loss: 0.9995 - classification_loss: 0.1661 246/500 [=============>................] - ETA: 1:23 - loss: 1.1657 - regression_loss: 0.9995 - classification_loss: 0.1662 247/500 [=============>................] - ETA: 1:23 - loss: 1.1661 - regression_loss: 0.9998 - classification_loss: 0.1663 248/500 [=============>................] - ETA: 1:23 - loss: 1.1669 - regression_loss: 1.0003 - classification_loss: 0.1666 249/500 [=============>................] - ETA: 1:22 - loss: 1.1693 - regression_loss: 1.0021 - classification_loss: 0.1672 250/500 [==============>...............] - ETA: 1:22 - loss: 1.1703 - regression_loss: 1.0029 - classification_loss: 0.1674 251/500 [==============>...............] - ETA: 1:22 - loss: 1.1677 - regression_loss: 1.0008 - classification_loss: 0.1670 252/500 [==============>...............] - ETA: 1:21 - loss: 1.1681 - regression_loss: 1.0013 - classification_loss: 0.1668 253/500 [==============>...............] - ETA: 1:21 - loss: 1.1692 - regression_loss: 1.0024 - classification_loss: 0.1668 254/500 [==============>...............] 
- ETA: 1:21 - loss: 1.1703 - regression_loss: 1.0034 - classification_loss: 0.1669 255/500 [==============>...............] - ETA: 1:20 - loss: 1.1707 - regression_loss: 1.0041 - classification_loss: 0.1666 256/500 [==============>...............] - ETA: 1:20 - loss: 1.1709 - regression_loss: 1.0043 - classification_loss: 0.1666 257/500 [==============>...............] - ETA: 1:20 - loss: 1.1683 - regression_loss: 1.0020 - classification_loss: 0.1663 258/500 [==============>...............] - ETA: 1:19 - loss: 1.1679 - regression_loss: 1.0017 - classification_loss: 0.1661 259/500 [==============>...............] - ETA: 1:19 - loss: 1.1686 - regression_loss: 1.0024 - classification_loss: 0.1662 260/500 [==============>...............] - ETA: 1:19 - loss: 1.1691 - regression_loss: 1.0031 - classification_loss: 0.1660 261/500 [==============>...............] - ETA: 1:18 - loss: 1.1684 - regression_loss: 1.0024 - classification_loss: 0.1659 262/500 [==============>...............] - ETA: 1:18 - loss: 1.1705 - regression_loss: 1.0043 - classification_loss: 0.1662 263/500 [==============>...............] - ETA: 1:18 - loss: 1.1695 - regression_loss: 1.0035 - classification_loss: 0.1660 264/500 [==============>...............] - ETA: 1:17 - loss: 1.1706 - regression_loss: 1.0044 - classification_loss: 0.1662 265/500 [==============>...............] - ETA: 1:17 - loss: 1.1703 - regression_loss: 1.0042 - classification_loss: 0.1662 266/500 [==============>...............] - ETA: 1:17 - loss: 1.1730 - regression_loss: 1.0064 - classification_loss: 0.1666 267/500 [===============>..............] - ETA: 1:16 - loss: 1.1764 - regression_loss: 1.0092 - classification_loss: 0.1672 268/500 [===============>..............] - ETA: 1:16 - loss: 1.1746 - regression_loss: 1.0074 - classification_loss: 0.1671 269/500 [===============>..............] - ETA: 1:16 - loss: 1.1750 - regression_loss: 1.0077 - classification_loss: 0.1673 270/500 [===============>..............] 
- ETA: 1:15 - loss: 1.1752 - regression_loss: 1.0077 - classification_loss: 0.1675 271/500 [===============>..............] - ETA: 1:15 - loss: 1.1758 - regression_loss: 1.0083 - classification_loss: 0.1675 272/500 [===============>..............] - ETA: 1:15 - loss: 1.1750 - regression_loss: 1.0077 - classification_loss: 0.1673 273/500 [===============>..............] - ETA: 1:14 - loss: 1.1757 - regression_loss: 1.0083 - classification_loss: 0.1673 274/500 [===============>..............] - ETA: 1:14 - loss: 1.1736 - regression_loss: 1.0066 - classification_loss: 0.1670 275/500 [===============>..............] - ETA: 1:14 - loss: 1.1751 - regression_loss: 1.0080 - classification_loss: 0.1671 276/500 [===============>..............] - ETA: 1:13 - loss: 1.1726 - regression_loss: 1.0058 - classification_loss: 0.1668 277/500 [===============>..............] - ETA: 1:13 - loss: 1.1733 - regression_loss: 1.0064 - classification_loss: 0.1669 278/500 [===============>..............] - ETA: 1:13 - loss: 1.1729 - regression_loss: 1.0061 - classification_loss: 0.1668 279/500 [===============>..............] - ETA: 1:12 - loss: 1.1726 - regression_loss: 1.0058 - classification_loss: 0.1668 280/500 [===============>..............] - ETA: 1:12 - loss: 1.1715 - regression_loss: 1.0049 - classification_loss: 0.1666 281/500 [===============>..............] - ETA: 1:12 - loss: 1.1720 - regression_loss: 1.0054 - classification_loss: 0.1666 282/500 [===============>..............] - ETA: 1:11 - loss: 1.1699 - regression_loss: 1.0036 - classification_loss: 0.1663 283/500 [===============>..............] - ETA: 1:11 - loss: 1.1697 - regression_loss: 1.0033 - classification_loss: 0.1664 284/500 [================>.............] - ETA: 1:11 - loss: 1.1683 - regression_loss: 1.0020 - classification_loss: 0.1662 285/500 [================>.............] - ETA: 1:10 - loss: 1.1705 - regression_loss: 1.0040 - classification_loss: 0.1665 286/500 [================>.............] 
- ETA: 1:10 - loss: 1.1711 - regression_loss: 1.0044 - classification_loss: 0.1667 287/500 [================>.............] - ETA: 1:10 - loss: 1.1686 - regression_loss: 1.0023 - classification_loss: 0.1663 288/500 [================>.............] - ETA: 1:09 - loss: 1.1701 - regression_loss: 1.0036 - classification_loss: 0.1665 289/500 [================>.............] - ETA: 1:09 - loss: 1.1700 - regression_loss: 1.0035 - classification_loss: 0.1665 290/500 [================>.............] - ETA: 1:09 - loss: 1.1707 - regression_loss: 1.0040 - classification_loss: 0.1667 291/500 [================>.............] - ETA: 1:08 - loss: 1.1716 - regression_loss: 1.0049 - classification_loss: 0.1667 292/500 [================>.............] - ETA: 1:08 - loss: 1.1694 - regression_loss: 1.0029 - classification_loss: 0.1664 293/500 [================>.............] - ETA: 1:08 - loss: 1.1705 - regression_loss: 1.0037 - classification_loss: 0.1668 294/500 [================>.............] - ETA: 1:07 - loss: 1.1691 - regression_loss: 1.0025 - classification_loss: 0.1666 295/500 [================>.............] - ETA: 1:07 - loss: 1.1682 - regression_loss: 1.0018 - classification_loss: 0.1664 296/500 [================>.............] - ETA: 1:07 - loss: 1.1688 - regression_loss: 1.0025 - classification_loss: 0.1663 297/500 [================>.............] - ETA: 1:06 - loss: 1.1677 - regression_loss: 1.0016 - classification_loss: 0.1662 298/500 [================>.............] - ETA: 1:06 - loss: 1.1681 - regression_loss: 1.0019 - classification_loss: 0.1662 299/500 [================>.............] - ETA: 1:06 - loss: 1.1676 - regression_loss: 1.0015 - classification_loss: 0.1662 300/500 [=================>............] - ETA: 1:06 - loss: 1.1663 - regression_loss: 1.0004 - classification_loss: 0.1659 301/500 [=================>............] - ETA: 1:05 - loss: 1.1672 - regression_loss: 1.0012 - classification_loss: 0.1659 302/500 [=================>............] 
- ETA: 1:05 - loss: 1.1687 - regression_loss: 1.0024 - classification_loss: 0.1663 303/500 [=================>............] - ETA: 1:05 - loss: 1.1700 - regression_loss: 1.0035 - classification_loss: 0.1665 304/500 [=================>............] - ETA: 1:04 - loss: 1.1717 - regression_loss: 1.0049 - classification_loss: 0.1668 305/500 [=================>............] - ETA: 1:04 - loss: 1.1711 - regression_loss: 1.0042 - classification_loss: 0.1669 306/500 [=================>............] - ETA: 1:04 - loss: 1.1684 - regression_loss: 1.0017 - classification_loss: 0.1667 307/500 [=================>............] - ETA: 1:03 - loss: 1.1697 - regression_loss: 1.0027 - classification_loss: 0.1670 308/500 [=================>............] - ETA: 1:03 - loss: 1.1709 - regression_loss: 1.0040 - classification_loss: 0.1669 309/500 [=================>............] - ETA: 1:03 - loss: 1.1703 - regression_loss: 1.0036 - classification_loss: 0.1667 310/500 [=================>............] - ETA: 1:02 - loss: 1.1706 - regression_loss: 1.0039 - classification_loss: 0.1667 311/500 [=================>............] - ETA: 1:02 - loss: 1.1719 - regression_loss: 1.0050 - classification_loss: 0.1669 312/500 [=================>............] - ETA: 1:02 - loss: 1.1720 - regression_loss: 1.0050 - classification_loss: 0.1669 313/500 [=================>............] - ETA: 1:01 - loss: 1.1733 - regression_loss: 1.0063 - classification_loss: 0.1670 314/500 [=================>............] - ETA: 1:01 - loss: 1.1739 - regression_loss: 1.0068 - classification_loss: 0.1670 315/500 [=================>............] - ETA: 1:01 - loss: 1.1734 - regression_loss: 1.0065 - classification_loss: 0.1669 316/500 [=================>............] - ETA: 1:00 - loss: 1.1756 - regression_loss: 1.0083 - classification_loss: 0.1672 317/500 [==================>...........] - ETA: 1:00 - loss: 1.1739 - regression_loss: 1.0070 - classification_loss: 0.1669 318/500 [==================>...........] 
- ETA: 1:00 - loss: 1.1723 - regression_loss: 1.0056 - classification_loss: 0.1668 319/500 [==================>...........] - ETA: 59s - loss: 1.1724 - regression_loss: 1.0057 - classification_loss: 0.1667  320/500 [==================>...........] - ETA: 59s - loss: 1.1718 - regression_loss: 1.0053 - classification_loss: 0.1665 321/500 [==================>...........] - ETA: 59s - loss: 1.1715 - regression_loss: 1.0049 - classification_loss: 0.1665 322/500 [==================>...........] - ETA: 58s - loss: 1.1705 - regression_loss: 1.0041 - classification_loss: 0.1663 323/500 [==================>...........] - ETA: 58s - loss: 1.1706 - regression_loss: 1.0042 - classification_loss: 0.1664 324/500 [==================>...........] - ETA: 58s - loss: 1.1718 - regression_loss: 1.0052 - classification_loss: 0.1666 325/500 [==================>...........] - ETA: 57s - loss: 1.1704 - regression_loss: 1.0040 - classification_loss: 0.1664 326/500 [==================>...........] - ETA: 57s - loss: 1.1707 - regression_loss: 1.0043 - classification_loss: 0.1664 327/500 [==================>...........] - ETA: 57s - loss: 1.1687 - regression_loss: 1.0026 - classification_loss: 0.1661 328/500 [==================>...........] - ETA: 56s - loss: 1.1696 - regression_loss: 1.0033 - classification_loss: 0.1663 329/500 [==================>...........] - ETA: 56s - loss: 1.1681 - regression_loss: 1.0021 - classification_loss: 0.1660 330/500 [==================>...........] - ETA: 56s - loss: 1.1664 - regression_loss: 1.0007 - classification_loss: 0.1657 331/500 [==================>...........] - ETA: 55s - loss: 1.1665 - regression_loss: 1.0009 - classification_loss: 0.1656 332/500 [==================>...........] - ETA: 55s - loss: 1.1667 - regression_loss: 1.0012 - classification_loss: 0.1655 333/500 [==================>...........] - ETA: 55s - loss: 1.1658 - regression_loss: 1.0004 - classification_loss: 0.1654 334/500 [===================>..........] 
- ETA: 54s - loss: 1.1643 - regression_loss: 0.9991 - classification_loss: 0.1652 335/500 [===================>..........] - ETA: 54s - loss: 1.1648 - regression_loss: 0.9996 - classification_loss: 0.1652 336/500 [===================>..........] - ETA: 54s - loss: 1.1660 - regression_loss: 1.0007 - classification_loss: 0.1653 337/500 [===================>..........] - ETA: 53s - loss: 1.1658 - regression_loss: 1.0006 - classification_loss: 0.1652 338/500 [===================>..........] - ETA: 53s - loss: 1.1653 - regression_loss: 1.0002 - classification_loss: 0.1651 339/500 [===================>..........] - ETA: 53s - loss: 1.1652 - regression_loss: 1.0002 - classification_loss: 0.1650 340/500 [===================>..........] - ETA: 52s - loss: 1.1660 - regression_loss: 1.0009 - classification_loss: 0.1651 341/500 [===================>..........] - ETA: 52s - loss: 1.1674 - regression_loss: 1.0021 - classification_loss: 0.1653 342/500 [===================>..........] - ETA: 52s - loss: 1.1658 - regression_loss: 1.0008 - classification_loss: 0.1650 343/500 [===================>..........] - ETA: 51s - loss: 1.1666 - regression_loss: 1.0015 - classification_loss: 0.1651 344/500 [===================>..........] - ETA: 51s - loss: 1.1661 - regression_loss: 1.0010 - classification_loss: 0.1651 345/500 [===================>..........] - ETA: 51s - loss: 1.1651 - regression_loss: 1.0001 - classification_loss: 0.1651 346/500 [===================>..........] - ETA: 50s - loss: 1.1638 - regression_loss: 0.9990 - classification_loss: 0.1648 347/500 [===================>..........] - ETA: 50s - loss: 1.1644 - regression_loss: 0.9996 - classification_loss: 0.1648 348/500 [===================>..........] - ETA: 50s - loss: 1.1643 - regression_loss: 0.9995 - classification_loss: 0.1648 349/500 [===================>..........] - ETA: 49s - loss: 1.1622 - regression_loss: 0.9976 - classification_loss: 0.1646 350/500 [====================>.........] 
- ETA: 49s - loss: 1.1631 - regression_loss: 0.9985 - classification_loss: 0.1646 351/500 [====================>.........] - ETA: 49s - loss: 1.1634 - regression_loss: 0.9987 - classification_loss: 0.1646 352/500 [====================>.........] - ETA: 48s - loss: 1.1652 - regression_loss: 1.0003 - classification_loss: 0.1649 353/500 [====================>.........] - ETA: 48s - loss: 1.1660 - regression_loss: 1.0012 - classification_loss: 0.1649 354/500 [====================>.........] - ETA: 48s - loss: 1.1666 - regression_loss: 1.0016 - classification_loss: 0.1650 355/500 [====================>.........] - ETA: 47s - loss: 1.1661 - regression_loss: 1.0012 - classification_loss: 0.1649 356/500 [====================>.........] - ETA: 47s - loss: 1.1658 - regression_loss: 1.0010 - classification_loss: 0.1648 357/500 [====================>.........] - ETA: 47s - loss: 1.1651 - regression_loss: 1.0004 - classification_loss: 0.1647 358/500 [====================>.........] - ETA: 46s - loss: 1.1637 - regression_loss: 0.9992 - classification_loss: 0.1645 359/500 [====================>.........] - ETA: 46s - loss: 1.1657 - regression_loss: 1.0009 - classification_loss: 0.1648 360/500 [====================>.........] - ETA: 46s - loss: 1.1655 - regression_loss: 1.0007 - classification_loss: 0.1648 361/500 [====================>.........] - ETA: 45s - loss: 1.1673 - regression_loss: 1.0021 - classification_loss: 0.1652 362/500 [====================>.........] - ETA: 45s - loss: 1.1691 - regression_loss: 1.0036 - classification_loss: 0.1655 363/500 [====================>.........] - ETA: 45s - loss: 1.1678 - regression_loss: 1.0024 - classification_loss: 0.1654 364/500 [====================>.........] - ETA: 44s - loss: 1.1690 - regression_loss: 1.0035 - classification_loss: 0.1655 365/500 [====================>.........] - ETA: 44s - loss: 1.1700 - regression_loss: 1.0043 - classification_loss: 0.1657 366/500 [====================>.........] 
- ETA: 44s - loss: 1.1704 - regression_loss: 1.0045 - classification_loss: 0.1659 367/500 [=====================>........] - ETA: 43s - loss: 1.1684 - regression_loss: 1.0028 - classification_loss: 0.1657 368/500 [=====================>........] - ETA: 43s - loss: 1.1662 - regression_loss: 1.0008 - classification_loss: 0.1654 369/500 [=====================>........] - ETA: 43s - loss: 1.1663 - regression_loss: 1.0009 - classification_loss: 0.1654 370/500 [=====================>........] - ETA: 42s - loss: 1.1645 - regression_loss: 0.9993 - classification_loss: 0.1652 371/500 [=====================>........] - ETA: 42s - loss: 1.1645 - regression_loss: 0.9992 - classification_loss: 0.1653 372/500 [=====================>........] - ETA: 42s - loss: 1.1658 - regression_loss: 1.0002 - classification_loss: 0.1655 373/500 [=====================>........] - ETA: 41s - loss: 1.1648 - regression_loss: 0.9992 - classification_loss: 0.1657 374/500 [=====================>........] - ETA: 41s - loss: 1.1649 - regression_loss: 0.9993 - classification_loss: 0.1656 375/500 [=====================>........] - ETA: 41s - loss: 1.1653 - regression_loss: 0.9996 - classification_loss: 0.1656 376/500 [=====================>........] - ETA: 40s - loss: 1.1657 - regression_loss: 1.0000 - classification_loss: 0.1657 377/500 [=====================>........] - ETA: 40s - loss: 1.1646 - regression_loss: 0.9990 - classification_loss: 0.1656 378/500 [=====================>........] - ETA: 40s - loss: 1.1631 - regression_loss: 0.9976 - classification_loss: 0.1655 379/500 [=====================>........] - ETA: 39s - loss: 1.1631 - regression_loss: 0.9976 - classification_loss: 0.1655 380/500 [=====================>........] - ETA: 39s - loss: 1.1637 - regression_loss: 0.9981 - classification_loss: 0.1655 381/500 [=====================>........] - ETA: 39s - loss: 1.1640 - regression_loss: 0.9984 - classification_loss: 0.1656 382/500 [=====================>........] 
- ETA: 39s - loss: 1.1628 - regression_loss: 0.9974 - classification_loss: 0.1654 383/500 [=====================>........] - ETA: 38s - loss: 1.1639 - regression_loss: 0.9981 - classification_loss: 0.1658 384/500 [======================>.......] - ETA: 38s - loss: 1.1636 - regression_loss: 0.9978 - classification_loss: 0.1658 385/500 [======================>.......] - ETA: 38s - loss: 1.1647 - regression_loss: 0.9988 - classification_loss: 0.1659 386/500 [======================>.......] - ETA: 37s - loss: 1.1636 - regression_loss: 0.9977 - classification_loss: 0.1658 387/500 [======================>.......] - ETA: 37s - loss: 1.1649 - regression_loss: 0.9989 - classification_loss: 0.1660 388/500 [======================>.......] - ETA: 37s - loss: 1.1632 - regression_loss: 0.9975 - classification_loss: 0.1657 389/500 [======================>.......] - ETA: 36s - loss: 1.1622 - regression_loss: 0.9967 - classification_loss: 0.1656 390/500 [======================>.......] - ETA: 36s - loss: 1.1603 - regression_loss: 0.9950 - classification_loss: 0.1653 391/500 [======================>.......] - ETA: 36s - loss: 1.1606 - regression_loss: 0.9951 - classification_loss: 0.1655 392/500 [======================>.......] - ETA: 35s - loss: 1.1606 - regression_loss: 0.9950 - classification_loss: 0.1655 393/500 [======================>.......] - ETA: 35s - loss: 1.1586 - regression_loss: 0.9933 - classification_loss: 0.1653 394/500 [======================>.......] - ETA: 35s - loss: 1.1603 - regression_loss: 0.9947 - classification_loss: 0.1657 395/500 [======================>.......] - ETA: 34s - loss: 1.1602 - regression_loss: 0.9945 - classification_loss: 0.1657 396/500 [======================>.......] - ETA: 34s - loss: 1.1601 - regression_loss: 0.9945 - classification_loss: 0.1656 397/500 [======================>.......] - ETA: 34s - loss: 1.1611 - regression_loss: 0.9953 - classification_loss: 0.1658 398/500 [======================>.......] 
- ETA: 33s - loss: 1.1620 - regression_loss: 0.9961 - classification_loss: 0.1659 399/500 [======================>.......] - ETA: 33s - loss: 1.1631 - regression_loss: 0.9971 - classification_loss: 0.1660 400/500 [=======================>......] - ETA: 33s - loss: 1.1636 - regression_loss: 0.9976 - classification_loss: 0.1660 401/500 [=======================>......] - ETA: 32s - loss: 1.1635 - regression_loss: 0.9976 - classification_loss: 0.1659 402/500 [=======================>......] - ETA: 32s - loss: 1.1631 - regression_loss: 0.9973 - classification_loss: 0.1658 403/500 [=======================>......] - ETA: 32s - loss: 1.1639 - regression_loss: 0.9979 - classification_loss: 0.1659 404/500 [=======================>......] - ETA: 31s - loss: 1.1626 - regression_loss: 0.9969 - classification_loss: 0.1657 405/500 [=======================>......] - ETA: 31s - loss: 1.1609 - regression_loss: 0.9954 - classification_loss: 0.1655 406/500 [=======================>......] - ETA: 31s - loss: 1.1598 - regression_loss: 0.9945 - classification_loss: 0.1654 407/500 [=======================>......] - ETA: 30s - loss: 1.1611 - regression_loss: 0.9954 - classification_loss: 0.1657 408/500 [=======================>......] - ETA: 30s - loss: 1.1619 - regression_loss: 0.9960 - classification_loss: 0.1659 409/500 [=======================>......] - ETA: 30s - loss: 1.1618 - regression_loss: 0.9960 - classification_loss: 0.1659 410/500 [=======================>......] - ETA: 29s - loss: 1.1605 - regression_loss: 0.9948 - classification_loss: 0.1656 411/500 [=======================>......] - ETA: 29s - loss: 1.1609 - regression_loss: 0.9953 - classification_loss: 0.1656 412/500 [=======================>......] - ETA: 29s - loss: 1.1620 - regression_loss: 0.9962 - classification_loss: 0.1658 413/500 [=======================>......] - ETA: 28s - loss: 1.1612 - regression_loss: 0.9956 - classification_loss: 0.1656 414/500 [=======================>......] 
500/500 [==============================] - 165s 331ms/step - loss: 1.1537 - regression_loss: 0.9898 - classification_loss: 0.1639
1172 instances of class plum with average precision: 0.6121
mAP: 0.6121
Epoch 00025: saving model to ./training/snapshots/resnet101_pascal_25.h5
Epoch 26/150
248/500 [=============>................] - ETA: 1:23 - loss: 1.1605 - regression_loss: 0.9976 - classification_loss: 0.1629
- ETA: 1:22 - loss: 1.1581 - regression_loss: 0.9955 - classification_loss: 0.1626 250/500 [==============>...............] - ETA: 1:22 - loss: 1.1568 - regression_loss: 0.9945 - classification_loss: 0.1623 251/500 [==============>...............] - ETA: 1:22 - loss: 1.1554 - regression_loss: 0.9934 - classification_loss: 0.1620 252/500 [==============>...............] - ETA: 1:21 - loss: 1.1554 - regression_loss: 0.9935 - classification_loss: 0.1619 253/500 [==============>...............] - ETA: 1:21 - loss: 1.1577 - regression_loss: 0.9954 - classification_loss: 0.1623 254/500 [==============>...............] - ETA: 1:21 - loss: 1.1559 - regression_loss: 0.9940 - classification_loss: 0.1620 255/500 [==============>...............] - ETA: 1:20 - loss: 1.1569 - regression_loss: 0.9947 - classification_loss: 0.1622 256/500 [==============>...............] - ETA: 1:20 - loss: 1.1564 - regression_loss: 0.9944 - classification_loss: 0.1621 257/500 [==============>...............] - ETA: 1:20 - loss: 1.1552 - regression_loss: 0.9933 - classification_loss: 0.1619 258/500 [==============>...............] - ETA: 1:19 - loss: 1.1557 - regression_loss: 0.9936 - classification_loss: 0.1621 259/500 [==============>...............] - ETA: 1:19 - loss: 1.1541 - regression_loss: 0.9921 - classification_loss: 0.1620 260/500 [==============>...............] - ETA: 1:19 - loss: 1.1563 - regression_loss: 0.9935 - classification_loss: 0.1627 261/500 [==============>...............] - ETA: 1:18 - loss: 1.1571 - regression_loss: 0.9943 - classification_loss: 0.1628 262/500 [==============>...............] - ETA: 1:18 - loss: 1.1548 - regression_loss: 0.9923 - classification_loss: 0.1625 263/500 [==============>...............] - ETA: 1:18 - loss: 1.1539 - regression_loss: 0.9918 - classification_loss: 0.1621 264/500 [==============>...............] - ETA: 1:17 - loss: 1.1551 - regression_loss: 0.9927 - classification_loss: 0.1624 265/500 [==============>...............] 
- ETA: 1:17 - loss: 1.1565 - regression_loss: 0.9939 - classification_loss: 0.1626 266/500 [==============>...............] - ETA: 1:17 - loss: 1.1551 - regression_loss: 0.9926 - classification_loss: 0.1625 267/500 [===============>..............] - ETA: 1:16 - loss: 1.1559 - regression_loss: 0.9933 - classification_loss: 0.1626 268/500 [===============>..............] - ETA: 1:16 - loss: 1.1553 - regression_loss: 0.9929 - classification_loss: 0.1624 269/500 [===============>..............] - ETA: 1:16 - loss: 1.1568 - regression_loss: 0.9942 - classification_loss: 0.1626 270/500 [===============>..............] - ETA: 1:15 - loss: 1.1566 - regression_loss: 0.9941 - classification_loss: 0.1626 271/500 [===============>..............] - ETA: 1:15 - loss: 1.1568 - regression_loss: 0.9943 - classification_loss: 0.1626 272/500 [===============>..............] - ETA: 1:15 - loss: 1.1545 - regression_loss: 0.9923 - classification_loss: 0.1622 273/500 [===============>..............] - ETA: 1:14 - loss: 1.1574 - regression_loss: 0.9947 - classification_loss: 0.1628 274/500 [===============>..............] - ETA: 1:14 - loss: 1.1561 - regression_loss: 0.9935 - classification_loss: 0.1627 275/500 [===============>..............] - ETA: 1:14 - loss: 1.1545 - regression_loss: 0.9921 - classification_loss: 0.1623 276/500 [===============>..............] - ETA: 1:13 - loss: 1.1567 - regression_loss: 0.9941 - classification_loss: 0.1626 277/500 [===============>..............] - ETA: 1:13 - loss: 1.1590 - regression_loss: 0.9960 - classification_loss: 0.1630 278/500 [===============>..............] - ETA: 1:13 - loss: 1.1600 - regression_loss: 0.9969 - classification_loss: 0.1631 279/500 [===============>..............] - ETA: 1:12 - loss: 1.1593 - regression_loss: 0.9963 - classification_loss: 0.1630 280/500 [===============>..............] - ETA: 1:12 - loss: 1.1573 - regression_loss: 0.9946 - classification_loss: 0.1627 281/500 [===============>..............] 
- ETA: 1:12 - loss: 1.1587 - regression_loss: 0.9960 - classification_loss: 0.1628 282/500 [===============>..............] - ETA: 1:11 - loss: 1.1609 - regression_loss: 0.9978 - classification_loss: 0.1631 283/500 [===============>..............] - ETA: 1:11 - loss: 1.1593 - regression_loss: 0.9964 - classification_loss: 0.1629 284/500 [================>.............] - ETA: 1:11 - loss: 1.1574 - regression_loss: 0.9949 - classification_loss: 0.1626 285/500 [================>.............] - ETA: 1:10 - loss: 1.1550 - regression_loss: 0.9927 - classification_loss: 0.1622 286/500 [================>.............] - ETA: 1:10 - loss: 1.1549 - regression_loss: 0.9927 - classification_loss: 0.1622 287/500 [================>.............] - ETA: 1:10 - loss: 1.1535 - regression_loss: 0.9917 - classification_loss: 0.1619 288/500 [================>.............] - ETA: 1:09 - loss: 1.1543 - regression_loss: 0.9922 - classification_loss: 0.1621 289/500 [================>.............] - ETA: 1:09 - loss: 1.1549 - regression_loss: 0.9927 - classification_loss: 0.1622 290/500 [================>.............] - ETA: 1:09 - loss: 1.1532 - regression_loss: 0.9912 - classification_loss: 0.1619 291/500 [================>.............] - ETA: 1:08 - loss: 1.1528 - regression_loss: 0.9909 - classification_loss: 0.1620 292/500 [================>.............] - ETA: 1:08 - loss: 1.1532 - regression_loss: 0.9912 - classification_loss: 0.1619 293/500 [================>.............] - ETA: 1:08 - loss: 1.1529 - regression_loss: 0.9910 - classification_loss: 0.1618 294/500 [================>.............] - ETA: 1:07 - loss: 1.1534 - regression_loss: 0.9914 - classification_loss: 0.1620 295/500 [================>.............] - ETA: 1:07 - loss: 1.1553 - regression_loss: 0.9930 - classification_loss: 0.1624 296/500 [================>.............] - ETA: 1:07 - loss: 1.1532 - regression_loss: 0.9911 - classification_loss: 0.1621 297/500 [================>.............] 
- ETA: 1:06 - loss: 1.1531 - regression_loss: 0.9910 - classification_loss: 0.1622 298/500 [================>.............] - ETA: 1:06 - loss: 1.1508 - regression_loss: 0.9889 - classification_loss: 0.1619 299/500 [================>.............] - ETA: 1:06 - loss: 1.1509 - regression_loss: 0.9889 - classification_loss: 0.1620 300/500 [=================>............] - ETA: 1:05 - loss: 1.1507 - regression_loss: 0.9886 - classification_loss: 0.1621 301/500 [=================>............] - ETA: 1:05 - loss: 1.1520 - regression_loss: 0.9897 - classification_loss: 0.1623 302/500 [=================>............] - ETA: 1:05 - loss: 1.1528 - regression_loss: 0.9902 - classification_loss: 0.1626 303/500 [=================>............] - ETA: 1:04 - loss: 1.1533 - regression_loss: 0.9908 - classification_loss: 0.1626 304/500 [=================>............] - ETA: 1:04 - loss: 1.1532 - regression_loss: 0.9908 - classification_loss: 0.1624 305/500 [=================>............] - ETA: 1:04 - loss: 1.1524 - regression_loss: 0.9902 - classification_loss: 0.1622 306/500 [=================>............] - ETA: 1:03 - loss: 1.1543 - regression_loss: 0.9919 - classification_loss: 0.1624 307/500 [=================>............] - ETA: 1:03 - loss: 1.1536 - regression_loss: 0.9914 - classification_loss: 0.1622 308/500 [=================>............] - ETA: 1:03 - loss: 1.1527 - regression_loss: 0.9907 - classification_loss: 0.1620 309/500 [=================>............] - ETA: 1:03 - loss: 1.1527 - regression_loss: 0.9908 - classification_loss: 0.1619 310/500 [=================>............] - ETA: 1:02 - loss: 1.1528 - regression_loss: 0.9909 - classification_loss: 0.1619 311/500 [=================>............] - ETA: 1:02 - loss: 1.1533 - regression_loss: 0.9913 - classification_loss: 0.1619 312/500 [=================>............] - ETA: 1:02 - loss: 1.1542 - regression_loss: 0.9922 - classification_loss: 0.1620 313/500 [=================>............] 
- ETA: 1:01 - loss: 1.1548 - regression_loss: 0.9927 - classification_loss: 0.1621 314/500 [=================>............] - ETA: 1:01 - loss: 1.1552 - regression_loss: 0.9931 - classification_loss: 0.1621 315/500 [=================>............] - ETA: 1:01 - loss: 1.1533 - regression_loss: 0.9913 - classification_loss: 0.1621 316/500 [=================>............] - ETA: 1:00 - loss: 1.1543 - regression_loss: 0.9920 - classification_loss: 0.1623 317/500 [==================>...........] - ETA: 1:00 - loss: 1.1554 - regression_loss: 0.9928 - classification_loss: 0.1626 318/500 [==================>...........] - ETA: 1:00 - loss: 1.1529 - regression_loss: 0.9907 - classification_loss: 0.1622 319/500 [==================>...........] - ETA: 59s - loss: 1.1511 - regression_loss: 0.9891 - classification_loss: 0.1620  320/500 [==================>...........] - ETA: 59s - loss: 1.1504 - regression_loss: 0.9885 - classification_loss: 0.1619 321/500 [==================>...........] - ETA: 59s - loss: 1.1486 - regression_loss: 0.9870 - classification_loss: 0.1615 322/500 [==================>...........] - ETA: 58s - loss: 1.1474 - regression_loss: 0.9860 - classification_loss: 0.1614 323/500 [==================>...........] - ETA: 58s - loss: 1.1473 - regression_loss: 0.9858 - classification_loss: 0.1615 324/500 [==================>...........] - ETA: 58s - loss: 1.1485 - regression_loss: 0.9867 - classification_loss: 0.1618 325/500 [==================>...........] - ETA: 57s - loss: 1.1507 - regression_loss: 0.9882 - classification_loss: 0.1625 326/500 [==================>...........] - ETA: 57s - loss: 1.1505 - regression_loss: 0.9878 - classification_loss: 0.1627 327/500 [==================>...........] - ETA: 57s - loss: 1.1506 - regression_loss: 0.9878 - classification_loss: 0.1628 328/500 [==================>...........] - ETA: 56s - loss: 1.1500 - regression_loss: 0.9873 - classification_loss: 0.1627 329/500 [==================>...........] 
- ETA: 56s - loss: 1.1502 - regression_loss: 0.9875 - classification_loss: 0.1627 330/500 [==================>...........] - ETA: 56s - loss: 1.1506 - regression_loss: 0.9879 - classification_loss: 0.1627 331/500 [==================>...........] - ETA: 55s - loss: 1.1518 - regression_loss: 0.9888 - classification_loss: 0.1629 332/500 [==================>...........] - ETA: 55s - loss: 1.1502 - regression_loss: 0.9875 - classification_loss: 0.1627 333/500 [==================>...........] - ETA: 55s - loss: 1.1498 - regression_loss: 0.9872 - classification_loss: 0.1626 334/500 [===================>..........] - ETA: 54s - loss: 1.1508 - regression_loss: 0.9880 - classification_loss: 0.1627 335/500 [===================>..........] - ETA: 54s - loss: 1.1501 - regression_loss: 0.9874 - classification_loss: 0.1627 336/500 [===================>..........] - ETA: 54s - loss: 1.1502 - regression_loss: 0.9873 - classification_loss: 0.1629 337/500 [===================>..........] - ETA: 53s - loss: 1.1487 - regression_loss: 0.9861 - classification_loss: 0.1626 338/500 [===================>..........] - ETA: 53s - loss: 1.1483 - regression_loss: 0.9857 - classification_loss: 0.1626 339/500 [===================>..........] - ETA: 53s - loss: 1.1469 - regression_loss: 0.9845 - classification_loss: 0.1624 340/500 [===================>..........] - ETA: 52s - loss: 1.1474 - regression_loss: 0.9849 - classification_loss: 0.1625 341/500 [===================>..........] - ETA: 52s - loss: 1.1484 - regression_loss: 0.9857 - classification_loss: 0.1627 342/500 [===================>..........] - ETA: 52s - loss: 1.1505 - regression_loss: 0.9873 - classification_loss: 0.1632 343/500 [===================>..........] - ETA: 51s - loss: 1.1500 - regression_loss: 0.9870 - classification_loss: 0.1630 344/500 [===================>..........] - ETA: 51s - loss: 1.1484 - regression_loss: 0.9855 - classification_loss: 0.1629 345/500 [===================>..........] 
- ETA: 51s - loss: 1.1485 - regression_loss: 0.9855 - classification_loss: 0.1629 346/500 [===================>..........] - ETA: 50s - loss: 1.1486 - regression_loss: 0.9857 - classification_loss: 0.1630 347/500 [===================>..........] - ETA: 50s - loss: 1.1465 - regression_loss: 0.9839 - classification_loss: 0.1627 348/500 [===================>..........] - ETA: 50s - loss: 1.1449 - regression_loss: 0.9825 - classification_loss: 0.1624 349/500 [===================>..........] - ETA: 49s - loss: 1.1448 - regression_loss: 0.9825 - classification_loss: 0.1623 350/500 [====================>.........] - ETA: 49s - loss: 1.1454 - regression_loss: 0.9830 - classification_loss: 0.1624 351/500 [====================>.........] - ETA: 49s - loss: 1.1433 - regression_loss: 0.9811 - classification_loss: 0.1622 352/500 [====================>.........] - ETA: 48s - loss: 1.1430 - regression_loss: 0.9808 - classification_loss: 0.1622 353/500 [====================>.........] - ETA: 48s - loss: 1.1417 - regression_loss: 0.9799 - classification_loss: 0.1619 354/500 [====================>.........] - ETA: 48s - loss: 1.1432 - regression_loss: 0.9811 - classification_loss: 0.1622 355/500 [====================>.........] - ETA: 47s - loss: 1.1436 - regression_loss: 0.9813 - classification_loss: 0.1624 356/500 [====================>.........] - ETA: 47s - loss: 1.1424 - regression_loss: 0.9802 - classification_loss: 0.1622 357/500 [====================>.........] - ETA: 47s - loss: 1.1435 - regression_loss: 0.9811 - classification_loss: 0.1624 358/500 [====================>.........] - ETA: 46s - loss: 1.1436 - regression_loss: 0.9811 - classification_loss: 0.1624 359/500 [====================>.........] - ETA: 46s - loss: 1.1426 - regression_loss: 0.9803 - classification_loss: 0.1623 360/500 [====================>.........] - ETA: 46s - loss: 1.1422 - regression_loss: 0.9800 - classification_loss: 0.1622 361/500 [====================>.........] 
- ETA: 45s - loss: 1.1412 - regression_loss: 0.9792 - classification_loss: 0.1620 362/500 [====================>.........] - ETA: 45s - loss: 1.1400 - regression_loss: 0.9782 - classification_loss: 0.1618 363/500 [====================>.........] - ETA: 45s - loss: 1.1389 - regression_loss: 0.9772 - classification_loss: 0.1616 364/500 [====================>.........] - ETA: 44s - loss: 1.1373 - regression_loss: 0.9759 - classification_loss: 0.1614 365/500 [====================>.........] - ETA: 44s - loss: 1.1369 - regression_loss: 0.9754 - classification_loss: 0.1614 366/500 [====================>.........] - ETA: 44s - loss: 1.1367 - regression_loss: 0.9752 - classification_loss: 0.1615 367/500 [=====================>........] - ETA: 43s - loss: 1.1374 - regression_loss: 0.9759 - classification_loss: 0.1615 368/500 [=====================>........] - ETA: 43s - loss: 1.1383 - regression_loss: 0.9766 - classification_loss: 0.1616 369/500 [=====================>........] - ETA: 43s - loss: 1.1371 - regression_loss: 0.9756 - classification_loss: 0.1615 370/500 [=====================>........] - ETA: 42s - loss: 1.1355 - regression_loss: 0.9742 - classification_loss: 0.1613 371/500 [=====================>........] - ETA: 42s - loss: 1.1351 - regression_loss: 0.9739 - classification_loss: 0.1612 372/500 [=====================>........] - ETA: 42s - loss: 1.1358 - regression_loss: 0.9745 - classification_loss: 0.1613 373/500 [=====================>........] - ETA: 41s - loss: 1.1368 - regression_loss: 0.9752 - classification_loss: 0.1615 374/500 [=====================>........] - ETA: 41s - loss: 1.1355 - regression_loss: 0.9742 - classification_loss: 0.1613 375/500 [=====================>........] - ETA: 41s - loss: 1.1358 - regression_loss: 0.9744 - classification_loss: 0.1614 376/500 [=====================>........] - ETA: 40s - loss: 1.1367 - regression_loss: 0.9752 - classification_loss: 0.1615 377/500 [=====================>........] 
- ETA: 40s - loss: 1.1378 - regression_loss: 0.9760 - classification_loss: 0.1618 378/500 [=====================>........] - ETA: 40s - loss: 1.1375 - regression_loss: 0.9755 - classification_loss: 0.1620 379/500 [=====================>........] - ETA: 39s - loss: 1.1388 - regression_loss: 0.9766 - classification_loss: 0.1622 380/500 [=====================>........] - ETA: 39s - loss: 1.1370 - regression_loss: 0.9751 - classification_loss: 0.1620 381/500 [=====================>........] - ETA: 39s - loss: 1.1363 - regression_loss: 0.9744 - classification_loss: 0.1618 382/500 [=====================>........] - ETA: 38s - loss: 1.1366 - regression_loss: 0.9750 - classification_loss: 0.1616 383/500 [=====================>........] - ETA: 38s - loss: 1.1353 - regression_loss: 0.9740 - classification_loss: 0.1613 384/500 [======================>.......] - ETA: 38s - loss: 1.1354 - regression_loss: 0.9741 - classification_loss: 0.1613 385/500 [======================>.......] - ETA: 37s - loss: 1.1353 - regression_loss: 0.9740 - classification_loss: 0.1613 386/500 [======================>.......] - ETA: 37s - loss: 1.1344 - regression_loss: 0.9734 - classification_loss: 0.1610 387/500 [======================>.......] - ETA: 37s - loss: 1.1340 - regression_loss: 0.9731 - classification_loss: 0.1609 388/500 [======================>.......] - ETA: 36s - loss: 1.1336 - regression_loss: 0.9728 - classification_loss: 0.1607 389/500 [======================>.......] - ETA: 36s - loss: 1.1339 - regression_loss: 0.9731 - classification_loss: 0.1608 390/500 [======================>.......] - ETA: 36s - loss: 1.1349 - regression_loss: 0.9741 - classification_loss: 0.1608 391/500 [======================>.......] - ETA: 35s - loss: 1.1359 - regression_loss: 0.9750 - classification_loss: 0.1609 392/500 [======================>.......] - ETA: 35s - loss: 1.1359 - regression_loss: 0.9750 - classification_loss: 0.1609 393/500 [======================>.......] 
- ETA: 35s - loss: 1.1358 - regression_loss: 0.9749 - classification_loss: 0.1609 394/500 [======================>.......] - ETA: 34s - loss: 1.1363 - regression_loss: 0.9755 - classification_loss: 0.1608 395/500 [======================>.......] - ETA: 34s - loss: 1.1374 - regression_loss: 0.9765 - classification_loss: 0.1610 396/500 [======================>.......] - ETA: 34s - loss: 1.1363 - regression_loss: 0.9755 - classification_loss: 0.1608 397/500 [======================>.......] - ETA: 33s - loss: 1.1367 - regression_loss: 0.9760 - classification_loss: 0.1607 398/500 [======================>.......] - ETA: 33s - loss: 1.1368 - regression_loss: 0.9760 - classification_loss: 0.1608 399/500 [======================>.......] - ETA: 33s - loss: 1.1380 - regression_loss: 0.9770 - classification_loss: 0.1610 400/500 [=======================>......] - ETA: 32s - loss: 1.1364 - regression_loss: 0.9755 - classification_loss: 0.1609 401/500 [=======================>......] - ETA: 32s - loss: 1.1350 - regression_loss: 0.9743 - classification_loss: 0.1608 402/500 [=======================>......] - ETA: 32s - loss: 1.1343 - regression_loss: 0.9737 - classification_loss: 0.1606 403/500 [=======================>......] - ETA: 31s - loss: 1.1328 - regression_loss: 0.9724 - classification_loss: 0.1604 404/500 [=======================>......] - ETA: 31s - loss: 1.1310 - regression_loss: 0.9708 - classification_loss: 0.1602 405/500 [=======================>......] - ETA: 31s - loss: 1.1295 - regression_loss: 0.9695 - classification_loss: 0.1600 406/500 [=======================>......] - ETA: 30s - loss: 1.1275 - regression_loss: 0.9678 - classification_loss: 0.1598 407/500 [=======================>......] - ETA: 30s - loss: 1.1281 - regression_loss: 0.9682 - classification_loss: 0.1598 408/500 [=======================>......] - ETA: 30s - loss: 1.1290 - regression_loss: 0.9691 - classification_loss: 0.1600 409/500 [=======================>......] 
- ETA: 29s - loss: 1.1269 - regression_loss: 0.9672 - classification_loss: 0.1597 410/500 [=======================>......] - ETA: 29s - loss: 1.1272 - regression_loss: 0.9675 - classification_loss: 0.1597 411/500 [=======================>......] - ETA: 29s - loss: 1.1274 - regression_loss: 0.9677 - classification_loss: 0.1597 412/500 [=======================>......] - ETA: 29s - loss: 1.1284 - regression_loss: 0.9684 - classification_loss: 0.1600 413/500 [=======================>......] - ETA: 28s - loss: 1.1275 - regression_loss: 0.9677 - classification_loss: 0.1599 414/500 [=======================>......] - ETA: 28s - loss: 1.1266 - regression_loss: 0.9669 - classification_loss: 0.1596 415/500 [=======================>......] - ETA: 28s - loss: 1.1265 - regression_loss: 0.9669 - classification_loss: 0.1596 416/500 [=======================>......] - ETA: 27s - loss: 1.1266 - regression_loss: 0.9669 - classification_loss: 0.1597 417/500 [========================>.....] - ETA: 27s - loss: 1.1274 - regression_loss: 0.9676 - classification_loss: 0.1598 418/500 [========================>.....] - ETA: 27s - loss: 1.1293 - regression_loss: 0.9692 - classification_loss: 0.1601 419/500 [========================>.....] - ETA: 26s - loss: 1.1294 - regression_loss: 0.9694 - classification_loss: 0.1601 420/500 [========================>.....] - ETA: 26s - loss: 1.1292 - regression_loss: 0.9692 - classification_loss: 0.1600 421/500 [========================>.....] - ETA: 26s - loss: 1.1287 - regression_loss: 0.9688 - classification_loss: 0.1600 422/500 [========================>.....] - ETA: 25s - loss: 1.1297 - regression_loss: 0.9696 - classification_loss: 0.1601 423/500 [========================>.....] - ETA: 25s - loss: 1.1292 - regression_loss: 0.9691 - classification_loss: 0.1601 424/500 [========================>.....] - ETA: 25s - loss: 1.1290 - regression_loss: 0.9690 - classification_loss: 0.1600 425/500 [========================>.....] 
- ETA: 24s - loss: 1.1276 - regression_loss: 0.9678 - classification_loss: 0.1598 426/500 [========================>.....] - ETA: 24s - loss: 1.1265 - regression_loss: 0.9668 - classification_loss: 0.1597 427/500 [========================>.....] - ETA: 24s - loss: 1.1258 - regression_loss: 0.9663 - classification_loss: 0.1595 428/500 [========================>.....] - ETA: 23s - loss: 1.1247 - regression_loss: 0.9654 - classification_loss: 0.1594 429/500 [========================>.....] - ETA: 23s - loss: 1.1250 - regression_loss: 0.9656 - classification_loss: 0.1593 430/500 [========================>.....] - ETA: 23s - loss: 1.1243 - regression_loss: 0.9652 - classification_loss: 0.1592 431/500 [========================>.....] - ETA: 22s - loss: 1.1253 - regression_loss: 0.9661 - classification_loss: 0.1592 432/500 [========================>.....] - ETA: 22s - loss: 1.1247 - regression_loss: 0.9656 - classification_loss: 0.1591 433/500 [========================>.....] - ETA: 22s - loss: 1.1268 - regression_loss: 0.9674 - classification_loss: 0.1594 434/500 [=========================>....] - ETA: 21s - loss: 1.1276 - regression_loss: 0.9681 - classification_loss: 0.1595 435/500 [=========================>....] - ETA: 21s - loss: 1.1278 - regression_loss: 0.9682 - classification_loss: 0.1596 436/500 [=========================>....] - ETA: 21s - loss: 1.1278 - regression_loss: 0.9682 - classification_loss: 0.1596 437/500 [=========================>....] - ETA: 20s - loss: 1.1283 - regression_loss: 0.9688 - classification_loss: 0.1595 438/500 [=========================>....] - ETA: 20s - loss: 1.1288 - regression_loss: 0.9693 - classification_loss: 0.1595 439/500 [=========================>....] - ETA: 20s - loss: 1.1274 - regression_loss: 0.9680 - classification_loss: 0.1593 440/500 [=========================>....] - ETA: 19s - loss: 1.1255 - regression_loss: 0.9664 - classification_loss: 0.1591 441/500 [=========================>....] 
- ETA: 19s - loss: 1.1267 - regression_loss: 0.9674 - classification_loss: 0.1594 442/500 [=========================>....] - ETA: 19s - loss: 1.1252 - regression_loss: 0.9661 - classification_loss: 0.1591 443/500 [=========================>....] - ETA: 18s - loss: 1.1260 - regression_loss: 0.9668 - classification_loss: 0.1592 444/500 [=========================>....] - ETA: 18s - loss: 1.1252 - regression_loss: 0.9661 - classification_loss: 0.1591 445/500 [=========================>....] - ETA: 18s - loss: 1.1259 - regression_loss: 0.9666 - classification_loss: 0.1593 446/500 [=========================>....] - ETA: 17s - loss: 1.1267 - regression_loss: 0.9673 - classification_loss: 0.1594 447/500 [=========================>....] - ETA: 17s - loss: 1.1268 - regression_loss: 0.9674 - classification_loss: 0.1594 448/500 [=========================>....] - ETA: 17s - loss: 1.1269 - regression_loss: 0.9675 - classification_loss: 0.1594 449/500 [=========================>....] - ETA: 16s - loss: 1.1286 - regression_loss: 0.9688 - classification_loss: 0.1597 450/500 [==========================>...] - ETA: 16s - loss: 1.1277 - regression_loss: 0.9681 - classification_loss: 0.1596 451/500 [==========================>...] - ETA: 16s - loss: 1.1274 - regression_loss: 0.9678 - classification_loss: 0.1596 452/500 [==========================>...] - ETA: 15s - loss: 1.1284 - regression_loss: 0.9687 - classification_loss: 0.1597 453/500 [==========================>...] - ETA: 15s - loss: 1.1286 - regression_loss: 0.9688 - classification_loss: 0.1598 454/500 [==========================>...] - ETA: 15s - loss: 1.1274 - regression_loss: 0.9678 - classification_loss: 0.1596 455/500 [==========================>...] - ETA: 14s - loss: 1.1260 - regression_loss: 0.9665 - classification_loss: 0.1595 456/500 [==========================>...] - ETA: 14s - loss: 1.1250 - regression_loss: 0.9656 - classification_loss: 0.1593 457/500 [==========================>...] 
- ETA: 14s - loss: 1.1238 - regression_loss: 0.9647 - classification_loss: 0.1591
[per-batch progress records for batches 458-499 of epoch 26 omitted; loss held steady around 1.12]
500/500 [==============================] - 165s 330ms/step - loss: 1.1255 - regression_loss: 0.9660 - classification_loss: 0.1595
1172 instances of class plum with average precision: 0.6265
mAP: 0.6265
Epoch 00026: saving model to ./training/snapshots/resnet101_pascal_26.h5
Epoch 27/150
[per-batch progress records for batches 1-291 of epoch 27 omitted; last recorded at 291/500 - ETA: 1:08 - loss: 1.0965 - regression_loss: 0.9420 - classification_loss: 0.1545]
292/500 [================>.............] 
- ETA: 1:08 - loss: 1.0951 - regression_loss: 0.9408 - classification_loss: 0.1544 293/500 [================>.............] - ETA: 1:08 - loss: 1.0960 - regression_loss: 0.9416 - classification_loss: 0.1544 294/500 [================>.............] - ETA: 1:07 - loss: 1.0947 - regression_loss: 0.9405 - classification_loss: 0.1542 295/500 [================>.............] - ETA: 1:07 - loss: 1.0954 - regression_loss: 0.9409 - classification_loss: 0.1545 296/500 [================>.............] - ETA: 1:07 - loss: 1.0966 - regression_loss: 0.9418 - classification_loss: 0.1547 297/500 [================>.............] - ETA: 1:06 - loss: 1.0976 - regression_loss: 0.9426 - classification_loss: 0.1550 298/500 [================>.............] - ETA: 1:06 - loss: 1.0963 - regression_loss: 0.9416 - classification_loss: 0.1548 299/500 [================>.............] - ETA: 1:06 - loss: 1.0951 - regression_loss: 0.9404 - classification_loss: 0.1547 300/500 [=================>............] - ETA: 1:05 - loss: 1.0939 - regression_loss: 0.9394 - classification_loss: 0.1545 301/500 [=================>............] - ETA: 1:05 - loss: 1.0931 - regression_loss: 0.9386 - classification_loss: 0.1544 302/500 [=================>............] - ETA: 1:05 - loss: 1.0919 - regression_loss: 0.9378 - classification_loss: 0.1542 303/500 [=================>............] - ETA: 1:04 - loss: 1.0935 - regression_loss: 0.9391 - classification_loss: 0.1544 304/500 [=================>............] - ETA: 1:04 - loss: 1.0921 - regression_loss: 0.9379 - classification_loss: 0.1542 305/500 [=================>............] - ETA: 1:04 - loss: 1.0915 - regression_loss: 0.9374 - classification_loss: 0.1541 306/500 [=================>............] - ETA: 1:03 - loss: 1.0936 - regression_loss: 0.9390 - classification_loss: 0.1545 307/500 [=================>............] - ETA: 1:03 - loss: 1.0958 - regression_loss: 0.9410 - classification_loss: 0.1548 308/500 [=================>............] 
- ETA: 1:03 - loss: 1.0965 - regression_loss: 0.9416 - classification_loss: 0.1549 309/500 [=================>............] - ETA: 1:02 - loss: 1.0969 - regression_loss: 0.9421 - classification_loss: 0.1548 310/500 [=================>............] - ETA: 1:02 - loss: 1.0959 - regression_loss: 0.9413 - classification_loss: 0.1546 311/500 [=================>............] - ETA: 1:02 - loss: 1.0971 - regression_loss: 0.9423 - classification_loss: 0.1548 312/500 [=================>............] - ETA: 1:01 - loss: 1.1002 - regression_loss: 0.9449 - classification_loss: 0.1553 313/500 [=================>............] - ETA: 1:01 - loss: 1.0984 - regression_loss: 0.9433 - classification_loss: 0.1551 314/500 [=================>............] - ETA: 1:01 - loss: 1.0992 - regression_loss: 0.9439 - classification_loss: 0.1553 315/500 [=================>............] - ETA: 1:00 - loss: 1.0971 - regression_loss: 0.9419 - classification_loss: 0.1552 316/500 [=================>............] - ETA: 1:00 - loss: 1.0982 - regression_loss: 0.9428 - classification_loss: 0.1554 317/500 [==================>...........] - ETA: 1:00 - loss: 1.0957 - regression_loss: 0.9407 - classification_loss: 0.1550 318/500 [==================>...........] - ETA: 59s - loss: 1.0976 - regression_loss: 0.9422 - classification_loss: 0.1553  319/500 [==================>...........] - ETA: 59s - loss: 1.0977 - regression_loss: 0.9423 - classification_loss: 0.1554 320/500 [==================>...........] - ETA: 59s - loss: 1.0975 - regression_loss: 0.9423 - classification_loss: 0.1552 321/500 [==================>...........] - ETA: 58s - loss: 1.0985 - regression_loss: 0.9432 - classification_loss: 0.1553 322/500 [==================>...........] - ETA: 58s - loss: 1.0967 - regression_loss: 0.9417 - classification_loss: 0.1550 323/500 [==================>...........] - ETA: 58s - loss: 1.0963 - regression_loss: 0.9414 - classification_loss: 0.1549 324/500 [==================>...........] 
- ETA: 57s - loss: 1.0972 - regression_loss: 0.9421 - classification_loss: 0.1551 325/500 [==================>...........] - ETA: 57s - loss: 1.0969 - regression_loss: 0.9419 - classification_loss: 0.1550 326/500 [==================>...........] - ETA: 57s - loss: 1.0978 - regression_loss: 0.9428 - classification_loss: 0.1550 327/500 [==================>...........] - ETA: 56s - loss: 1.0983 - regression_loss: 0.9432 - classification_loss: 0.1551 328/500 [==================>...........] - ETA: 56s - loss: 1.0967 - regression_loss: 0.9419 - classification_loss: 0.1549 329/500 [==================>...........] - ETA: 56s - loss: 1.0956 - regression_loss: 0.9410 - classification_loss: 0.1546 330/500 [==================>...........] - ETA: 55s - loss: 1.0942 - regression_loss: 0.9398 - classification_loss: 0.1543 331/500 [==================>...........] - ETA: 55s - loss: 1.0952 - regression_loss: 0.9407 - classification_loss: 0.1545 332/500 [==================>...........] - ETA: 55s - loss: 1.0966 - regression_loss: 0.9419 - classification_loss: 0.1547 333/500 [==================>...........] - ETA: 55s - loss: 1.0964 - regression_loss: 0.9417 - classification_loss: 0.1547 334/500 [===================>..........] - ETA: 54s - loss: 1.0980 - regression_loss: 0.9429 - classification_loss: 0.1551 335/500 [===================>..........] - ETA: 54s - loss: 1.0977 - regression_loss: 0.9426 - classification_loss: 0.1551 336/500 [===================>..........] - ETA: 54s - loss: 1.0985 - regression_loss: 0.9432 - classification_loss: 0.1552 337/500 [===================>..........] - ETA: 53s - loss: 1.0968 - regression_loss: 0.9419 - classification_loss: 0.1549 338/500 [===================>..........] - ETA: 53s - loss: 1.0959 - regression_loss: 0.9411 - classification_loss: 0.1548 339/500 [===================>..........] - ETA: 53s - loss: 1.0944 - regression_loss: 0.9398 - classification_loss: 0.1545 340/500 [===================>..........] 
- ETA: 52s - loss: 1.0943 - regression_loss: 0.9399 - classification_loss: 0.1544 341/500 [===================>..........] - ETA: 52s - loss: 1.0959 - regression_loss: 0.9411 - classification_loss: 0.1548 342/500 [===================>..........] - ETA: 52s - loss: 1.0947 - regression_loss: 0.9401 - classification_loss: 0.1546 343/500 [===================>..........] - ETA: 51s - loss: 1.0934 - regression_loss: 0.9389 - classification_loss: 0.1545 344/500 [===================>..........] - ETA: 51s - loss: 1.0935 - regression_loss: 0.9389 - classification_loss: 0.1546 345/500 [===================>..........] - ETA: 51s - loss: 1.0935 - regression_loss: 0.9390 - classification_loss: 0.1544 346/500 [===================>..........] - ETA: 50s - loss: 1.0930 - regression_loss: 0.9387 - classification_loss: 0.1544 347/500 [===================>..........] - ETA: 50s - loss: 1.0928 - regression_loss: 0.9383 - classification_loss: 0.1545 348/500 [===================>..........] - ETA: 50s - loss: 1.0914 - regression_loss: 0.9371 - classification_loss: 0.1543 349/500 [===================>..........] - ETA: 49s - loss: 1.0901 - regression_loss: 0.9361 - classification_loss: 0.1540 350/500 [====================>.........] - ETA: 49s - loss: 1.0902 - regression_loss: 0.9361 - classification_loss: 0.1540 351/500 [====================>.........] - ETA: 49s - loss: 1.0893 - regression_loss: 0.9354 - classification_loss: 0.1539 352/500 [====================>.........] - ETA: 48s - loss: 1.0889 - regression_loss: 0.9351 - classification_loss: 0.1539 353/500 [====================>.........] - ETA: 48s - loss: 1.0889 - regression_loss: 0.9350 - classification_loss: 0.1540 354/500 [====================>.........] - ETA: 48s - loss: 1.0902 - regression_loss: 0.9361 - classification_loss: 0.1540 355/500 [====================>.........] - ETA: 47s - loss: 1.0902 - regression_loss: 0.9361 - classification_loss: 0.1540 356/500 [====================>.........] 
- ETA: 47s - loss: 1.0886 - regression_loss: 0.9348 - classification_loss: 0.1538 357/500 [====================>.........] - ETA: 47s - loss: 1.0878 - regression_loss: 0.9341 - classification_loss: 0.1537 358/500 [====================>.........] - ETA: 46s - loss: 1.0888 - regression_loss: 0.9349 - classification_loss: 0.1538 359/500 [====================>.........] - ETA: 46s - loss: 1.0891 - regression_loss: 0.9355 - classification_loss: 0.1536 360/500 [====================>.........] - ETA: 46s - loss: 1.0871 - regression_loss: 0.9338 - classification_loss: 0.1533 361/500 [====================>.........] - ETA: 45s - loss: 1.0878 - regression_loss: 0.9344 - classification_loss: 0.1534 362/500 [====================>.........] - ETA: 45s - loss: 1.0893 - regression_loss: 0.9357 - classification_loss: 0.1536 363/500 [====================>.........] - ETA: 45s - loss: 1.0875 - regression_loss: 0.9341 - classification_loss: 0.1534 364/500 [====================>.........] - ETA: 44s - loss: 1.0880 - regression_loss: 0.9348 - classification_loss: 0.1531 365/500 [====================>.........] - ETA: 44s - loss: 1.0885 - regression_loss: 0.9354 - classification_loss: 0.1532 366/500 [====================>.........] - ETA: 44s - loss: 1.0892 - regression_loss: 0.9358 - classification_loss: 0.1534 367/500 [=====================>........] - ETA: 43s - loss: 1.0897 - regression_loss: 0.9362 - classification_loss: 0.1534 368/500 [=====================>........] - ETA: 43s - loss: 1.0925 - regression_loss: 0.9387 - classification_loss: 0.1538 369/500 [=====================>........] - ETA: 43s - loss: 1.0917 - regression_loss: 0.9380 - classification_loss: 0.1537 370/500 [=====================>........] - ETA: 42s - loss: 1.0918 - regression_loss: 0.9382 - classification_loss: 0.1537 371/500 [=====================>........] - ETA: 42s - loss: 1.0922 - regression_loss: 0.9384 - classification_loss: 0.1538 372/500 [=====================>........] 
- ETA: 42s - loss: 1.0939 - regression_loss: 0.9399 - classification_loss: 0.1540 373/500 [=====================>........] - ETA: 41s - loss: 1.0943 - regression_loss: 0.9403 - classification_loss: 0.1541 374/500 [=====================>........] - ETA: 41s - loss: 1.0943 - regression_loss: 0.9404 - classification_loss: 0.1539 375/500 [=====================>........] - ETA: 41s - loss: 1.0934 - regression_loss: 0.9395 - classification_loss: 0.1539 376/500 [=====================>........] - ETA: 40s - loss: 1.0936 - regression_loss: 0.9396 - classification_loss: 0.1539 377/500 [=====================>........] - ETA: 40s - loss: 1.0945 - regression_loss: 0.9404 - classification_loss: 0.1541 378/500 [=====================>........] - ETA: 40s - loss: 1.0949 - regression_loss: 0.9406 - classification_loss: 0.1542 379/500 [=====================>........] - ETA: 39s - loss: 1.0941 - regression_loss: 0.9399 - classification_loss: 0.1542 380/500 [=====================>........] - ETA: 39s - loss: 1.0946 - regression_loss: 0.9403 - classification_loss: 0.1543 381/500 [=====================>........] - ETA: 39s - loss: 1.0928 - regression_loss: 0.9387 - classification_loss: 0.1541 382/500 [=====================>........] - ETA: 38s - loss: 1.0917 - regression_loss: 0.9377 - classification_loss: 0.1540 383/500 [=====================>........] - ETA: 38s - loss: 1.0899 - regression_loss: 0.9362 - classification_loss: 0.1537 384/500 [======================>.......] - ETA: 38s - loss: 1.0897 - regression_loss: 0.9361 - classification_loss: 0.1536 385/500 [======================>.......] - ETA: 37s - loss: 1.0905 - regression_loss: 0.9367 - classification_loss: 0.1538 386/500 [======================>.......] - ETA: 37s - loss: 1.0907 - regression_loss: 0.9370 - classification_loss: 0.1538 387/500 [======================>.......] - ETA: 37s - loss: 1.0893 - regression_loss: 0.9358 - classification_loss: 0.1535 388/500 [======================>.......] 
- ETA: 36s - loss: 1.0913 - regression_loss: 0.9374 - classification_loss: 0.1539 389/500 [======================>.......] - ETA: 36s - loss: 1.0905 - regression_loss: 0.9368 - classification_loss: 0.1537 390/500 [======================>.......] - ETA: 36s - loss: 1.0913 - regression_loss: 0.9376 - classification_loss: 0.1538 391/500 [======================>.......] - ETA: 35s - loss: 1.0926 - regression_loss: 0.9385 - classification_loss: 0.1540 392/500 [======================>.......] - ETA: 35s - loss: 1.0930 - regression_loss: 0.9389 - classification_loss: 0.1541 393/500 [======================>.......] - ETA: 35s - loss: 1.0925 - regression_loss: 0.9386 - classification_loss: 0.1539 394/500 [======================>.......] - ETA: 34s - loss: 1.0933 - regression_loss: 0.9392 - classification_loss: 0.1540 395/500 [======================>.......] - ETA: 34s - loss: 1.0946 - regression_loss: 0.9404 - classification_loss: 0.1542 396/500 [======================>.......] - ETA: 34s - loss: 1.0949 - regression_loss: 0.9406 - classification_loss: 0.1543 397/500 [======================>.......] - ETA: 34s - loss: 1.0959 - regression_loss: 0.9414 - classification_loss: 0.1545 398/500 [======================>.......] - ETA: 33s - loss: 1.0955 - regression_loss: 0.9412 - classification_loss: 0.1543 399/500 [======================>.......] - ETA: 33s - loss: 1.0959 - regression_loss: 0.9416 - classification_loss: 0.1542 400/500 [=======================>......] - ETA: 33s - loss: 1.0950 - regression_loss: 0.9409 - classification_loss: 0.1541 401/500 [=======================>......] - ETA: 32s - loss: 1.0942 - regression_loss: 0.9402 - classification_loss: 0.1540 402/500 [=======================>......] - ETA: 32s - loss: 1.0930 - regression_loss: 0.9392 - classification_loss: 0.1539 403/500 [=======================>......] - ETA: 32s - loss: 1.0937 - regression_loss: 0.9398 - classification_loss: 0.1539 404/500 [=======================>......] 
- ETA: 31s - loss: 1.0933 - regression_loss: 0.9394 - classification_loss: 0.1539 405/500 [=======================>......] - ETA: 31s - loss: 1.0944 - regression_loss: 0.9403 - classification_loss: 0.1541 406/500 [=======================>......] - ETA: 31s - loss: 1.0929 - regression_loss: 0.9390 - classification_loss: 0.1539 407/500 [=======================>......] - ETA: 30s - loss: 1.0928 - regression_loss: 0.9390 - classification_loss: 0.1538 408/500 [=======================>......] - ETA: 30s - loss: 1.0934 - regression_loss: 0.9394 - classification_loss: 0.1540 409/500 [=======================>......] - ETA: 30s - loss: 1.0930 - regression_loss: 0.9391 - classification_loss: 0.1539 410/500 [=======================>......] - ETA: 29s - loss: 1.0932 - regression_loss: 0.9393 - classification_loss: 0.1539 411/500 [=======================>......] - ETA: 29s - loss: 1.0948 - regression_loss: 0.9407 - classification_loss: 0.1541 412/500 [=======================>......] - ETA: 29s - loss: 1.0954 - regression_loss: 0.9413 - classification_loss: 0.1541 413/500 [=======================>......] - ETA: 28s - loss: 1.0939 - regression_loss: 0.9400 - classification_loss: 0.1540 414/500 [=======================>......] - ETA: 28s - loss: 1.0923 - regression_loss: 0.9386 - classification_loss: 0.1538 415/500 [=======================>......] - ETA: 28s - loss: 1.0931 - regression_loss: 0.9393 - classification_loss: 0.1538 416/500 [=======================>......] - ETA: 27s - loss: 1.0927 - regression_loss: 0.9389 - classification_loss: 0.1538 417/500 [========================>.....] - ETA: 27s - loss: 1.0918 - regression_loss: 0.9381 - classification_loss: 0.1536 418/500 [========================>.....] - ETA: 27s - loss: 1.0923 - regression_loss: 0.9385 - classification_loss: 0.1538 419/500 [========================>.....] - ETA: 26s - loss: 1.0929 - regression_loss: 0.9390 - classification_loss: 0.1538 420/500 [========================>.....] 
- ETA: 26s - loss: 1.0937 - regression_loss: 0.9397 - classification_loss: 0.1540 421/500 [========================>.....] - ETA: 26s - loss: 1.0942 - regression_loss: 0.9401 - classification_loss: 0.1540 422/500 [========================>.....] - ETA: 25s - loss: 1.0933 - regression_loss: 0.9393 - classification_loss: 0.1539 423/500 [========================>.....] - ETA: 25s - loss: 1.0931 - regression_loss: 0.9392 - classification_loss: 0.1539 424/500 [========================>.....] - ETA: 25s - loss: 1.0922 - regression_loss: 0.9384 - classification_loss: 0.1538 425/500 [========================>.....] - ETA: 24s - loss: 1.0919 - regression_loss: 0.9381 - classification_loss: 0.1538 426/500 [========================>.....] - ETA: 24s - loss: 1.0917 - regression_loss: 0.9379 - classification_loss: 0.1539 427/500 [========================>.....] - ETA: 24s - loss: 1.0904 - regression_loss: 0.9367 - classification_loss: 0.1537 428/500 [========================>.....] - ETA: 23s - loss: 1.0897 - regression_loss: 0.9362 - classification_loss: 0.1535 429/500 [========================>.....] - ETA: 23s - loss: 1.0879 - regression_loss: 0.9346 - classification_loss: 0.1533 430/500 [========================>.....] - ETA: 23s - loss: 1.0883 - regression_loss: 0.9350 - classification_loss: 0.1534 431/500 [========================>.....] - ETA: 22s - loss: 1.0867 - regression_loss: 0.9336 - classification_loss: 0.1531 432/500 [========================>.....] - ETA: 22s - loss: 1.0891 - regression_loss: 0.9355 - classification_loss: 0.1537 433/500 [========================>.....] - ETA: 22s - loss: 1.0893 - regression_loss: 0.9356 - classification_loss: 0.1536 434/500 [=========================>....] - ETA: 21s - loss: 1.0891 - regression_loss: 0.9355 - classification_loss: 0.1536 435/500 [=========================>....] - ETA: 21s - loss: 1.0891 - regression_loss: 0.9354 - classification_loss: 0.1537 436/500 [=========================>....] 
- ETA: 21s - loss: 1.0884 - regression_loss: 0.9348 - classification_loss: 0.1536 437/500 [=========================>....] - ETA: 20s - loss: 1.0885 - regression_loss: 0.9348 - classification_loss: 0.1537 438/500 [=========================>....] - ETA: 20s - loss: 1.0883 - regression_loss: 0.9346 - classification_loss: 0.1538 439/500 [=========================>....] - ETA: 20s - loss: 1.0900 - regression_loss: 0.9360 - classification_loss: 0.1540 440/500 [=========================>....] - ETA: 19s - loss: 1.0907 - regression_loss: 0.9366 - classification_loss: 0.1541 441/500 [=========================>....] - ETA: 19s - loss: 1.0899 - regression_loss: 0.9359 - classification_loss: 0.1540 442/500 [=========================>....] - ETA: 19s - loss: 1.0889 - regression_loss: 0.9352 - classification_loss: 0.1538 443/500 [=========================>....] - ETA: 18s - loss: 1.0893 - regression_loss: 0.9354 - classification_loss: 0.1539 444/500 [=========================>....] - ETA: 18s - loss: 1.0887 - regression_loss: 0.9348 - classification_loss: 0.1539 445/500 [=========================>....] - ETA: 18s - loss: 1.0881 - regression_loss: 0.9343 - classification_loss: 0.1538 446/500 [=========================>....] - ETA: 17s - loss: 1.0877 - regression_loss: 0.9340 - classification_loss: 0.1537 447/500 [=========================>....] - ETA: 17s - loss: 1.0881 - regression_loss: 0.9343 - classification_loss: 0.1538 448/500 [=========================>....] - ETA: 17s - loss: 1.0872 - regression_loss: 0.9336 - classification_loss: 0.1536 449/500 [=========================>....] - ETA: 16s - loss: 1.0874 - regression_loss: 0.9336 - classification_loss: 0.1538 450/500 [==========================>...] - ETA: 16s - loss: 1.0879 - regression_loss: 0.9339 - classification_loss: 0.1540 451/500 [==========================>...] - ETA: 16s - loss: 1.0869 - regression_loss: 0.9330 - classification_loss: 0.1539 452/500 [==========================>...] 
- ETA: 15s - loss: 1.0858 - regression_loss: 0.9320 - classification_loss: 0.1537 453/500 [==========================>...] - ETA: 15s - loss: 1.0861 - regression_loss: 0.9323 - classification_loss: 0.1538 454/500 [==========================>...] - ETA: 15s - loss: 1.0857 - regression_loss: 0.9320 - classification_loss: 0.1537 455/500 [==========================>...] - ETA: 14s - loss: 1.0875 - regression_loss: 0.9335 - classification_loss: 0.1540 456/500 [==========================>...] - ETA: 14s - loss: 1.0866 - regression_loss: 0.9328 - classification_loss: 0.1539 457/500 [==========================>...] - ETA: 14s - loss: 1.0871 - regression_loss: 0.9332 - classification_loss: 0.1540 458/500 [==========================>...] - ETA: 13s - loss: 1.0877 - regression_loss: 0.9337 - classification_loss: 0.1540 459/500 [==========================>...] - ETA: 13s - loss: 1.0864 - regression_loss: 0.9325 - classification_loss: 0.1539 460/500 [==========================>...] - ETA: 13s - loss: 1.0877 - regression_loss: 0.9336 - classification_loss: 0.1541 461/500 [==========================>...] - ETA: 12s - loss: 1.0887 - regression_loss: 0.9344 - classification_loss: 0.1543 462/500 [==========================>...] - ETA: 12s - loss: 1.0870 - regression_loss: 0.9329 - classification_loss: 0.1541 463/500 [==========================>...] - ETA: 12s - loss: 1.0880 - regression_loss: 0.9337 - classification_loss: 0.1543 464/500 [==========================>...] - ETA: 11s - loss: 1.0878 - regression_loss: 0.9334 - classification_loss: 0.1543 465/500 [==========================>...] - ETA: 11s - loss: 1.0875 - regression_loss: 0.9333 - classification_loss: 0.1542 466/500 [==========================>...] - ETA: 11s - loss: 1.0868 - regression_loss: 0.9328 - classification_loss: 0.1540 467/500 [===========================>..] - ETA: 10s - loss: 1.0872 - regression_loss: 0.9331 - classification_loss: 0.1540 468/500 [===========================>..] 
- ETA: 10s - loss: 1.0861 - regression_loss: 0.9323 - classification_loss: 0.1538 469/500 [===========================>..] - ETA: 10s - loss: 1.0850 - regression_loss: 0.9313 - classification_loss: 0.1537 470/500 [===========================>..] - ETA: 9s - loss: 1.0836 - regression_loss: 0.9302 - classification_loss: 0.1534  471/500 [===========================>..] - ETA: 9s - loss: 1.0839 - regression_loss: 0.9304 - classification_loss: 0.1535 472/500 [===========================>..] - ETA: 9s - loss: 1.0860 - regression_loss: 0.9322 - classification_loss: 0.1539 473/500 [===========================>..] - ETA: 8s - loss: 1.0865 - regression_loss: 0.9327 - classification_loss: 0.1539 474/500 [===========================>..] - ETA: 8s - loss: 1.0851 - regression_loss: 0.9315 - classification_loss: 0.1536 475/500 [===========================>..] - ETA: 8s - loss: 1.0838 - regression_loss: 0.9304 - classification_loss: 0.1535 476/500 [===========================>..] - ETA: 7s - loss: 1.0827 - regression_loss: 0.9294 - classification_loss: 0.1534 477/500 [===========================>..] - ETA: 7s - loss: 1.0836 - regression_loss: 0.9302 - classification_loss: 0.1534 478/500 [===========================>..] - ETA: 7s - loss: 1.0846 - regression_loss: 0.9309 - classification_loss: 0.1537 479/500 [===========================>..] - ETA: 6s - loss: 1.0830 - regression_loss: 0.9295 - classification_loss: 0.1535 480/500 [===========================>..] - ETA: 6s - loss: 1.0818 - regression_loss: 0.9284 - classification_loss: 0.1534 481/500 [===========================>..] - ETA: 6s - loss: 1.0817 - regression_loss: 0.9282 - classification_loss: 0.1534 482/500 [===========================>..] - ETA: 5s - loss: 1.0811 - regression_loss: 0.9278 - classification_loss: 0.1533 483/500 [===========================>..] - ETA: 5s - loss: 1.0802 - regression_loss: 0.9271 - classification_loss: 0.1531 484/500 [============================>.] 
- ETA: 5s - loss: 1.0804 - regression_loss: 0.9272 - classification_loss: 0.1531 485/500 [============================>.] - ETA: 4s - loss: 1.0807 - regression_loss: 0.9275 - classification_loss: 0.1531 486/500 [============================>.] - ETA: 4s - loss: 1.0817 - regression_loss: 0.9284 - classification_loss: 0.1533 487/500 [============================>.] - ETA: 4s - loss: 1.0821 - regression_loss: 0.9288 - classification_loss: 0.1533 488/500 [============================>.] - ETA: 3s - loss: 1.0824 - regression_loss: 0.9292 - classification_loss: 0.1532 489/500 [============================>.] - ETA: 3s - loss: 1.0823 - regression_loss: 0.9290 - classification_loss: 0.1532 490/500 [============================>.] - ETA: 3s - loss: 1.0809 - regression_loss: 0.9279 - classification_loss: 0.1530 491/500 [============================>.] - ETA: 2s - loss: 1.0809 - regression_loss: 0.9278 - classification_loss: 0.1531 492/500 [============================>.] - ETA: 2s - loss: 1.0820 - regression_loss: 0.9288 - classification_loss: 0.1533 493/500 [============================>.] - ETA: 2s - loss: 1.0826 - regression_loss: 0.9293 - classification_loss: 0.1533 494/500 [============================>.] - ETA: 1s - loss: 1.0835 - regression_loss: 0.9301 - classification_loss: 0.1535 495/500 [============================>.] - ETA: 1s - loss: 1.0830 - regression_loss: 0.9297 - classification_loss: 0.1533 496/500 [============================>.] - ETA: 1s - loss: 1.0826 - regression_loss: 0.9293 - classification_loss: 0.1533 497/500 [============================>.] - ETA: 0s - loss: 1.0828 - regression_loss: 0.9295 - classification_loss: 0.1533 498/500 [============================>.] - ETA: 0s - loss: 1.0834 - regression_loss: 0.9301 - classification_loss: 0.1533 499/500 [============================>.] 
500/500 [==============================] - 165s 330ms/step - loss: 1.0821 - regression_loss: 0.9288 - classification_loss: 0.1533
1172 instances of class plum with average precision: 0.6286
mAP: 0.6286
Epoch 00027: saving model to ./training/snapshots/resnet101_pascal_27.h5
Epoch 28/150
[Epoch 28 progress, steps 1-61 of 500: per-step progress-bar updates condensed. The running loss fluctuated between ~0.90 and ~1.28 over the first dozen steps as the average stabilized, settling near ~1.10 (regression_loss ~0.95, classification_loss ~0.154) by step 61, with ETA around 2:26.]
- ETA: 2:25 - loss: 1.1135 - regression_loss: 0.9586 - classification_loss: 0.1549 63/500 [==>...........................] - ETA: 2:25 - loss: 1.1215 - regression_loss: 0.9655 - classification_loss: 0.1560 64/500 [==>...........................] - ETA: 2:24 - loss: 1.1163 - regression_loss: 0.9615 - classification_loss: 0.1548 65/500 [==>...........................] - ETA: 2:24 - loss: 1.1179 - regression_loss: 0.9617 - classification_loss: 0.1562 66/500 [==>...........................] - ETA: 2:24 - loss: 1.1195 - regression_loss: 0.9629 - classification_loss: 0.1566 67/500 [===>..........................] - ETA: 2:24 - loss: 1.1113 - regression_loss: 0.9559 - classification_loss: 0.1553 68/500 [===>..........................] - ETA: 2:23 - loss: 1.1161 - regression_loss: 0.9599 - classification_loss: 0.1562 69/500 [===>..........................] - ETA: 2:23 - loss: 1.1223 - regression_loss: 0.9646 - classification_loss: 0.1577 70/500 [===>..........................] - ETA: 2:23 - loss: 1.1120 - regression_loss: 0.9557 - classification_loss: 0.1564 71/500 [===>..........................] - ETA: 2:22 - loss: 1.1053 - regression_loss: 0.9497 - classification_loss: 0.1556 72/500 [===>..........................] - ETA: 2:22 - loss: 1.1052 - regression_loss: 0.9493 - classification_loss: 0.1559 73/500 [===>..........................] - ETA: 2:22 - loss: 1.1046 - regression_loss: 0.9489 - classification_loss: 0.1557 74/500 [===>..........................] - ETA: 2:21 - loss: 1.0999 - regression_loss: 0.9451 - classification_loss: 0.1548 75/500 [===>..........................] - ETA: 2:21 - loss: 1.1044 - regression_loss: 0.9484 - classification_loss: 0.1560 76/500 [===>..........................] - ETA: 2:21 - loss: 1.0947 - regression_loss: 0.9401 - classification_loss: 0.1546 77/500 [===>..........................] - ETA: 2:20 - loss: 1.0970 - regression_loss: 0.9425 - classification_loss: 0.1546 78/500 [===>..........................] 
- ETA: 2:20 - loss: 1.0931 - regression_loss: 0.9385 - classification_loss: 0.1546 79/500 [===>..........................] - ETA: 2:19 - loss: 1.0938 - regression_loss: 0.9391 - classification_loss: 0.1546 80/500 [===>..........................] - ETA: 2:19 - loss: 1.0967 - regression_loss: 0.9419 - classification_loss: 0.1548 81/500 [===>..........................] - ETA: 2:19 - loss: 1.0871 - regression_loss: 0.9337 - classification_loss: 0.1534 82/500 [===>..........................] - ETA: 2:18 - loss: 1.0883 - regression_loss: 0.9346 - classification_loss: 0.1536 83/500 [===>..........................] - ETA: 2:18 - loss: 1.0899 - regression_loss: 0.9358 - classification_loss: 0.1541 84/500 [====>.........................] - ETA: 2:18 - loss: 1.0914 - regression_loss: 0.9370 - classification_loss: 0.1544 85/500 [====>.........................] - ETA: 2:17 - loss: 1.0833 - regression_loss: 0.9300 - classification_loss: 0.1534 86/500 [====>.........................] - ETA: 2:17 - loss: 1.0891 - regression_loss: 0.9342 - classification_loss: 0.1549 87/500 [====>.........................] - ETA: 2:16 - loss: 1.0935 - regression_loss: 0.9382 - classification_loss: 0.1553 88/500 [====>.........................] - ETA: 2:16 - loss: 1.0934 - regression_loss: 0.9383 - classification_loss: 0.1551 89/500 [====>.........................] - ETA: 2:16 - loss: 1.0959 - regression_loss: 0.9405 - classification_loss: 0.1554 90/500 [====>.........................] - ETA: 2:15 - loss: 1.1000 - regression_loss: 0.9443 - classification_loss: 0.1557 91/500 [====>.........................] - ETA: 2:15 - loss: 1.0944 - regression_loss: 0.9397 - classification_loss: 0.1546 92/500 [====>.........................] - ETA: 2:15 - loss: 1.0907 - regression_loss: 0.9370 - classification_loss: 0.1537 93/500 [====>.........................] - ETA: 2:14 - loss: 1.0889 - regression_loss: 0.9356 - classification_loss: 0.1533 94/500 [====>.........................] 
- ETA: 2:14 - loss: 1.0933 - regression_loss: 0.9390 - classification_loss: 0.1543 95/500 [====>.........................] - ETA: 2:14 - loss: 1.0904 - regression_loss: 0.9364 - classification_loss: 0.1540 96/500 [====>.........................] - ETA: 2:13 - loss: 1.0900 - regression_loss: 0.9358 - classification_loss: 0.1542 97/500 [====>.........................] - ETA: 2:13 - loss: 1.0882 - regression_loss: 0.9340 - classification_loss: 0.1542 98/500 [====>.........................] - ETA: 2:13 - loss: 1.0835 - regression_loss: 0.9290 - classification_loss: 0.1544 99/500 [====>.........................] - ETA: 2:13 - loss: 1.0775 - regression_loss: 0.9241 - classification_loss: 0.1534 100/500 [=====>........................] - ETA: 2:12 - loss: 1.0782 - regression_loss: 0.9247 - classification_loss: 0.1535 101/500 [=====>........................] - ETA: 2:12 - loss: 1.0770 - regression_loss: 0.9233 - classification_loss: 0.1538 102/500 [=====>........................] - ETA: 2:11 - loss: 1.0726 - regression_loss: 0.9195 - classification_loss: 0.1532 103/500 [=====>........................] - ETA: 2:11 - loss: 1.0658 - regression_loss: 0.9136 - classification_loss: 0.1522 104/500 [=====>........................] - ETA: 2:11 - loss: 1.0650 - regression_loss: 0.9129 - classification_loss: 0.1521 105/500 [=====>........................] - ETA: 2:10 - loss: 1.0640 - regression_loss: 0.9121 - classification_loss: 0.1518 106/500 [=====>........................] - ETA: 2:10 - loss: 1.0637 - regression_loss: 0.9117 - classification_loss: 0.1519 107/500 [=====>........................] - ETA: 2:10 - loss: 1.0582 - regression_loss: 0.9073 - classification_loss: 0.1509 108/500 [=====>........................] - ETA: 2:09 - loss: 1.0615 - regression_loss: 0.9104 - classification_loss: 0.1510 109/500 [=====>........................] - ETA: 2:09 - loss: 1.0648 - regression_loss: 0.9135 - classification_loss: 0.1512 110/500 [=====>........................] 
- ETA: 2:09 - loss: 1.0650 - regression_loss: 0.9141 - classification_loss: 0.1509 111/500 [=====>........................] - ETA: 2:08 - loss: 1.0641 - regression_loss: 0.9136 - classification_loss: 0.1505 112/500 [=====>........................] - ETA: 2:08 - loss: 1.0660 - regression_loss: 0.9157 - classification_loss: 0.1503 113/500 [=====>........................] - ETA: 2:07 - loss: 1.0627 - regression_loss: 0.9131 - classification_loss: 0.1496 114/500 [=====>........................] - ETA: 2:07 - loss: 1.0620 - regression_loss: 0.9125 - classification_loss: 0.1495 115/500 [=====>........................] - ETA: 2:07 - loss: 1.0593 - regression_loss: 0.9103 - classification_loss: 0.1490 116/500 [=====>........................] - ETA: 2:07 - loss: 1.0587 - regression_loss: 0.9101 - classification_loss: 0.1486 117/500 [======>.......................] - ETA: 2:06 - loss: 1.0619 - regression_loss: 0.9127 - classification_loss: 0.1492 118/500 [======>.......................] - ETA: 2:06 - loss: 1.0614 - regression_loss: 0.9123 - classification_loss: 0.1491 119/500 [======>.......................] - ETA: 2:06 - loss: 1.0638 - regression_loss: 0.9143 - classification_loss: 0.1495 120/500 [======>.......................] - ETA: 2:05 - loss: 1.0587 - regression_loss: 0.9099 - classification_loss: 0.1489 121/500 [======>.......................] - ETA: 2:05 - loss: 1.0636 - regression_loss: 0.9140 - classification_loss: 0.1496 122/500 [======>.......................] - ETA: 2:05 - loss: 1.0600 - regression_loss: 0.9109 - classification_loss: 0.1491 123/500 [======>.......................] - ETA: 2:04 - loss: 1.0560 - regression_loss: 0.9075 - classification_loss: 0.1484 124/500 [======>.......................] - ETA: 2:04 - loss: 1.0524 - regression_loss: 0.9044 - classification_loss: 0.1480 125/500 [======>.......................] - ETA: 2:04 - loss: 1.0528 - regression_loss: 0.9045 - classification_loss: 0.1483 126/500 [======>.......................] 
- ETA: 2:03 - loss: 1.0559 - regression_loss: 0.9071 - classification_loss: 0.1488 127/500 [======>.......................] - ETA: 2:03 - loss: 1.0601 - regression_loss: 0.9103 - classification_loss: 0.1497 128/500 [======>.......................] - ETA: 2:03 - loss: 1.0564 - regression_loss: 0.9074 - classification_loss: 0.1490 129/500 [======>.......................] - ETA: 2:02 - loss: 1.0595 - regression_loss: 0.9098 - classification_loss: 0.1497 130/500 [======>.......................] - ETA: 2:02 - loss: 1.0537 - regression_loss: 0.9047 - classification_loss: 0.1490 131/500 [======>.......................] - ETA: 2:02 - loss: 1.0535 - regression_loss: 0.9044 - classification_loss: 0.1491 132/500 [======>.......................] - ETA: 2:01 - loss: 1.0493 - regression_loss: 0.9009 - classification_loss: 0.1484 133/500 [======>.......................] - ETA: 2:01 - loss: 1.0534 - regression_loss: 0.9047 - classification_loss: 0.1488 134/500 [=======>......................] - ETA: 2:01 - loss: 1.0543 - regression_loss: 0.9055 - classification_loss: 0.1488 135/500 [=======>......................] - ETA: 2:00 - loss: 1.0503 - regression_loss: 0.9023 - classification_loss: 0.1480 136/500 [=======>......................] - ETA: 2:00 - loss: 1.0530 - regression_loss: 0.9046 - classification_loss: 0.1484 137/500 [=======>......................] - ETA: 2:00 - loss: 1.0495 - regression_loss: 0.9015 - classification_loss: 0.1479 138/500 [=======>......................] - ETA: 2:00 - loss: 1.0510 - regression_loss: 0.9026 - classification_loss: 0.1484 139/500 [=======>......................] - ETA: 1:59 - loss: 1.0526 - regression_loss: 0.9041 - classification_loss: 0.1485 140/500 [=======>......................] - ETA: 1:59 - loss: 1.0536 - regression_loss: 0.9052 - classification_loss: 0.1484 141/500 [=======>......................] - ETA: 1:59 - loss: 1.0563 - regression_loss: 0.9076 - classification_loss: 0.1487 142/500 [=======>......................] 
- ETA: 1:58 - loss: 1.0599 - regression_loss: 0.9112 - classification_loss: 0.1487 143/500 [=======>......................] - ETA: 1:58 - loss: 1.0586 - regression_loss: 0.9103 - classification_loss: 0.1483 144/500 [=======>......................] - ETA: 1:58 - loss: 1.0617 - regression_loss: 0.9130 - classification_loss: 0.1487 145/500 [=======>......................] - ETA: 1:57 - loss: 1.0595 - regression_loss: 0.9111 - classification_loss: 0.1484 146/500 [=======>......................] - ETA: 1:57 - loss: 1.0642 - regression_loss: 0.9149 - classification_loss: 0.1494 147/500 [=======>......................] - ETA: 1:57 - loss: 1.0612 - regression_loss: 0.9124 - classification_loss: 0.1489 148/500 [=======>......................] - ETA: 1:56 - loss: 1.0587 - regression_loss: 0.9098 - classification_loss: 0.1489 149/500 [=======>......................] - ETA: 1:56 - loss: 1.0570 - regression_loss: 0.9085 - classification_loss: 0.1485 150/500 [========>.....................] - ETA: 1:56 - loss: 1.0550 - regression_loss: 0.9070 - classification_loss: 0.1480 151/500 [========>.....................] - ETA: 1:55 - loss: 1.0573 - regression_loss: 0.9092 - classification_loss: 0.1481 152/500 [========>.....................] - ETA: 1:55 - loss: 1.0565 - regression_loss: 0.9087 - classification_loss: 0.1479 153/500 [========>.....................] - ETA: 1:55 - loss: 1.0579 - regression_loss: 0.9099 - classification_loss: 0.1480 154/500 [========>.....................] - ETA: 1:54 - loss: 1.0586 - regression_loss: 0.9107 - classification_loss: 0.1480 155/500 [========>.....................] - ETA: 1:54 - loss: 1.0621 - regression_loss: 0.9134 - classification_loss: 0.1487 156/500 [========>.....................] - ETA: 1:54 - loss: 1.0615 - regression_loss: 0.9129 - classification_loss: 0.1486 157/500 [========>.....................] - ETA: 1:53 - loss: 1.0616 - regression_loss: 0.9131 - classification_loss: 0.1485 158/500 [========>.....................] 
- ETA: 1:53 - loss: 1.0637 - regression_loss: 0.9147 - classification_loss: 0.1490 159/500 [========>.....................] - ETA: 1:53 - loss: 1.0643 - regression_loss: 0.9153 - classification_loss: 0.1489 160/500 [========>.....................] - ETA: 1:52 - loss: 1.0678 - regression_loss: 0.9187 - classification_loss: 0.1492 161/500 [========>.....................] - ETA: 1:52 - loss: 1.0708 - regression_loss: 0.9216 - classification_loss: 0.1492 162/500 [========>.....................] - ETA: 1:52 - loss: 1.0734 - regression_loss: 0.9240 - classification_loss: 0.1494 163/500 [========>.....................] - ETA: 1:51 - loss: 1.0736 - regression_loss: 0.9244 - classification_loss: 0.1492 164/500 [========>.....................] - ETA: 1:51 - loss: 1.0767 - regression_loss: 0.9273 - classification_loss: 0.1494 165/500 [========>.....................] - ETA: 1:51 - loss: 1.0731 - regression_loss: 0.9243 - classification_loss: 0.1488 166/500 [========>.....................] - ETA: 1:50 - loss: 1.0738 - regression_loss: 0.9251 - classification_loss: 0.1487 167/500 [=========>....................] - ETA: 1:50 - loss: 1.0735 - regression_loss: 0.9249 - classification_loss: 0.1487 168/500 [=========>....................] - ETA: 1:49 - loss: 1.0782 - regression_loss: 0.9289 - classification_loss: 0.1493 169/500 [=========>....................] - ETA: 1:49 - loss: 1.0771 - regression_loss: 0.9278 - classification_loss: 0.1493 170/500 [=========>....................] - ETA: 1:49 - loss: 1.0774 - regression_loss: 0.9281 - classification_loss: 0.1494 171/500 [=========>....................] - ETA: 1:49 - loss: 1.0798 - regression_loss: 0.9301 - classification_loss: 0.1496 172/500 [=========>....................] - ETA: 1:48 - loss: 1.0846 - regression_loss: 0.9341 - classification_loss: 0.1505 173/500 [=========>....................] - ETA: 1:48 - loss: 1.0849 - regression_loss: 0.9343 - classification_loss: 0.1506 174/500 [=========>....................] 
- ETA: 1:48 - loss: 1.0857 - regression_loss: 0.9352 - classification_loss: 0.1505 175/500 [=========>....................] - ETA: 1:47 - loss: 1.0866 - regression_loss: 0.9359 - classification_loss: 0.1507 176/500 [=========>....................] - ETA: 1:47 - loss: 1.0905 - regression_loss: 0.9390 - classification_loss: 0.1514 177/500 [=========>....................] - ETA: 1:47 - loss: 1.0916 - regression_loss: 0.9401 - classification_loss: 0.1515 178/500 [=========>....................] - ETA: 1:46 - loss: 1.0899 - regression_loss: 0.9386 - classification_loss: 0.1513 179/500 [=========>....................] - ETA: 1:46 - loss: 1.0885 - regression_loss: 0.9375 - classification_loss: 0.1510 180/500 [=========>....................] - ETA: 1:46 - loss: 1.0911 - regression_loss: 0.9396 - classification_loss: 0.1515 181/500 [=========>....................] - ETA: 1:45 - loss: 1.0912 - regression_loss: 0.9396 - classification_loss: 0.1516 182/500 [=========>....................] - ETA: 1:45 - loss: 1.0926 - regression_loss: 0.9406 - classification_loss: 0.1520 183/500 [=========>....................] - ETA: 1:45 - loss: 1.0980 - regression_loss: 0.9449 - classification_loss: 0.1531 184/500 [==========>...................] - ETA: 1:44 - loss: 1.0952 - regression_loss: 0.9425 - classification_loss: 0.1527 185/500 [==========>...................] - ETA: 1:44 - loss: 1.0963 - regression_loss: 0.9434 - classification_loss: 0.1529 186/500 [==========>...................] - ETA: 1:44 - loss: 1.0977 - regression_loss: 0.9446 - classification_loss: 0.1531 187/500 [==========>...................] - ETA: 1:43 - loss: 1.0950 - regression_loss: 0.9422 - classification_loss: 0.1528 188/500 [==========>...................] - ETA: 1:43 - loss: 1.0966 - regression_loss: 0.9436 - classification_loss: 0.1530 189/500 [==========>...................] - ETA: 1:43 - loss: 1.0932 - regression_loss: 0.9405 - classification_loss: 0.1526 190/500 [==========>...................] 
- ETA: 1:42 - loss: 1.0912 - regression_loss: 0.9388 - classification_loss: 0.1524 191/500 [==========>...................] - ETA: 1:42 - loss: 1.0915 - regression_loss: 0.9391 - classification_loss: 0.1524 192/500 [==========>...................] - ETA: 1:42 - loss: 1.0931 - regression_loss: 0.9401 - classification_loss: 0.1530 193/500 [==========>...................] - ETA: 1:41 - loss: 1.0955 - regression_loss: 0.9420 - classification_loss: 0.1535 194/500 [==========>...................] - ETA: 1:41 - loss: 1.0964 - regression_loss: 0.9427 - classification_loss: 0.1537 195/500 [==========>...................] - ETA: 1:41 - loss: 1.0973 - regression_loss: 0.9435 - classification_loss: 0.1537 196/500 [==========>...................] - ETA: 1:40 - loss: 1.0947 - regression_loss: 0.9414 - classification_loss: 0.1532 197/500 [==========>...................] - ETA: 1:40 - loss: 1.0963 - regression_loss: 0.9429 - classification_loss: 0.1534 198/500 [==========>...................] - ETA: 1:40 - loss: 1.0990 - regression_loss: 0.9451 - classification_loss: 0.1539 199/500 [==========>...................] - ETA: 1:39 - loss: 1.0983 - regression_loss: 0.9445 - classification_loss: 0.1538 200/500 [===========>..................] - ETA: 1:39 - loss: 1.0968 - regression_loss: 0.9432 - classification_loss: 0.1536 201/500 [===========>..................] - ETA: 1:39 - loss: 1.0962 - regression_loss: 0.9428 - classification_loss: 0.1534 202/500 [===========>..................] - ETA: 1:38 - loss: 1.0974 - regression_loss: 0.9439 - classification_loss: 0.1535 203/500 [===========>..................] - ETA: 1:38 - loss: 1.0984 - regression_loss: 0.9448 - classification_loss: 0.1536 204/500 [===========>..................] - ETA: 1:38 - loss: 1.0963 - regression_loss: 0.9430 - classification_loss: 0.1533 205/500 [===========>..................] - ETA: 1:37 - loss: 1.0966 - regression_loss: 0.9433 - classification_loss: 0.1533 206/500 [===========>..................] 
- ETA: 1:37 - loss: 1.0967 - regression_loss: 0.9434 - classification_loss: 0.1533 207/500 [===========>..................] - ETA: 1:37 - loss: 1.0974 - regression_loss: 0.9443 - classification_loss: 0.1531 208/500 [===========>..................] - ETA: 1:36 - loss: 1.0983 - regression_loss: 0.9450 - classification_loss: 0.1533 209/500 [===========>..................] - ETA: 1:36 - loss: 1.0993 - regression_loss: 0.9459 - classification_loss: 0.1534 210/500 [===========>..................] - ETA: 1:36 - loss: 1.0999 - regression_loss: 0.9466 - classification_loss: 0.1533 211/500 [===========>..................] - ETA: 1:35 - loss: 1.0985 - regression_loss: 0.9454 - classification_loss: 0.1531 212/500 [===========>..................] - ETA: 1:35 - loss: 1.0956 - regression_loss: 0.9429 - classification_loss: 0.1527 213/500 [===========>..................] - ETA: 1:35 - loss: 1.0955 - regression_loss: 0.9429 - classification_loss: 0.1526 214/500 [===========>..................] - ETA: 1:34 - loss: 1.0943 - regression_loss: 0.9419 - classification_loss: 0.1524 215/500 [===========>..................] - ETA: 1:34 - loss: 1.0921 - regression_loss: 0.9398 - classification_loss: 0.1523 216/500 [===========>..................] - ETA: 1:34 - loss: 1.0924 - regression_loss: 0.9401 - classification_loss: 0.1523 217/500 [============>.................] - ETA: 1:33 - loss: 1.0895 - regression_loss: 0.9376 - classification_loss: 0.1519 218/500 [============>.................] - ETA: 1:33 - loss: 1.0872 - regression_loss: 0.9357 - classification_loss: 0.1515 219/500 [============>.................] - ETA: 1:33 - loss: 1.0882 - regression_loss: 0.9368 - classification_loss: 0.1514 220/500 [============>.................] - ETA: 1:32 - loss: 1.0870 - regression_loss: 0.9358 - classification_loss: 0.1512 221/500 [============>.................] - ETA: 1:32 - loss: 1.0841 - regression_loss: 0.9332 - classification_loss: 0.1508 222/500 [============>.................] 
- ETA: 1:32 - loss: 1.0849 - regression_loss: 0.9340 - classification_loss: 0.1509 223/500 [============>.................] - ETA: 1:31 - loss: 1.0847 - regression_loss: 0.9338 - classification_loss: 0.1509 224/500 [============>.................] - ETA: 1:31 - loss: 1.0873 - regression_loss: 0.9357 - classification_loss: 0.1516 225/500 [============>.................] - ETA: 1:31 - loss: 1.0853 - regression_loss: 0.9341 - classification_loss: 0.1513 226/500 [============>.................] - ETA: 1:30 - loss: 1.0868 - regression_loss: 0.9354 - classification_loss: 0.1515 227/500 [============>.................] - ETA: 1:30 - loss: 1.0881 - regression_loss: 0.9364 - classification_loss: 0.1517 228/500 [============>.................] - ETA: 1:30 - loss: 1.0911 - regression_loss: 0.9390 - classification_loss: 0.1521 229/500 [============>.................] - ETA: 1:29 - loss: 1.0888 - regression_loss: 0.9371 - classification_loss: 0.1517 230/500 [============>.................] - ETA: 1:29 - loss: 1.0872 - regression_loss: 0.9358 - classification_loss: 0.1514 231/500 [============>.................] - ETA: 1:29 - loss: 1.0873 - regression_loss: 0.9360 - classification_loss: 0.1513 232/500 [============>.................] - ETA: 1:28 - loss: 1.0878 - regression_loss: 0.9365 - classification_loss: 0.1513 233/500 [============>.................] - ETA: 1:28 - loss: 1.0853 - regression_loss: 0.9344 - classification_loss: 0.1509 234/500 [=============>................] - ETA: 1:28 - loss: 1.0840 - regression_loss: 0.9334 - classification_loss: 0.1506 235/500 [=============>................] - ETA: 1:27 - loss: 1.0846 - regression_loss: 0.9338 - classification_loss: 0.1508 236/500 [=============>................] - ETA: 1:27 - loss: 1.0840 - regression_loss: 0.9334 - classification_loss: 0.1506 237/500 [=============>................] - ETA: 1:27 - loss: 1.0844 - regression_loss: 0.9338 - classification_loss: 0.1506 238/500 [=============>................] 
- ETA: 1:26 - loss: 1.0844 - regression_loss: 0.9338 - classification_loss: 0.1507 239/500 [=============>................] - ETA: 1:26 - loss: 1.0863 - regression_loss: 0.9351 - classification_loss: 0.1512 240/500 [=============>................] - ETA: 1:26 - loss: 1.0848 - regression_loss: 0.9338 - classification_loss: 0.1510 241/500 [=============>................] - ETA: 1:25 - loss: 1.0873 - regression_loss: 0.9359 - classification_loss: 0.1514 242/500 [=============>................] - ETA: 1:25 - loss: 1.0872 - regression_loss: 0.9359 - classification_loss: 0.1513 243/500 [=============>................] - ETA: 1:25 - loss: 1.0871 - regression_loss: 0.9359 - classification_loss: 0.1512 244/500 [=============>................] - ETA: 1:24 - loss: 1.0878 - regression_loss: 0.9366 - classification_loss: 0.1512 245/500 [=============>................] - ETA: 1:24 - loss: 1.0872 - regression_loss: 0.9362 - classification_loss: 0.1511 246/500 [=============>................] - ETA: 1:24 - loss: 1.0880 - regression_loss: 0.9368 - classification_loss: 0.1512 247/500 [=============>................] - ETA: 1:23 - loss: 1.0864 - regression_loss: 0.9355 - classification_loss: 0.1509 248/500 [=============>................] - ETA: 1:23 - loss: 1.0885 - regression_loss: 0.9372 - classification_loss: 0.1513 249/500 [=============>................] - ETA: 1:23 - loss: 1.0881 - regression_loss: 0.9368 - classification_loss: 0.1513 250/500 [==============>...............] - ETA: 1:22 - loss: 1.0854 - regression_loss: 0.9343 - classification_loss: 0.1511 251/500 [==============>...............] - ETA: 1:22 - loss: 1.0863 - regression_loss: 0.9355 - classification_loss: 0.1508 252/500 [==============>...............] - ETA: 1:22 - loss: 1.0891 - regression_loss: 0.9377 - classification_loss: 0.1514 253/500 [==============>...............] - ETA: 1:21 - loss: 1.0921 - regression_loss: 0.9402 - classification_loss: 0.1519 254/500 [==============>...............] 
- ETA: 1:21 - loss: 1.0928 - regression_loss: 0.9409 - classification_loss: 0.1520 255/500 [==============>...............] - ETA: 1:21 - loss: 1.0948 - regression_loss: 0.9425 - classification_loss: 0.1523 256/500 [==============>...............] - ETA: 1:20 - loss: 1.0947 - regression_loss: 0.9427 - classification_loss: 0.1520 257/500 [==============>...............] - ETA: 1:20 - loss: 1.0967 - regression_loss: 0.9442 - classification_loss: 0.1524 258/500 [==============>...............] - ETA: 1:20 - loss: 1.0958 - regression_loss: 0.9435 - classification_loss: 0.1523 259/500 [==============>...............] - ETA: 1:19 - loss: 1.0953 - regression_loss: 0.9430 - classification_loss: 0.1523 260/500 [==============>...............] - ETA: 1:19 - loss: 1.0927 - regression_loss: 0.9408 - classification_loss: 0.1519 261/500 [==============>...............] - ETA: 1:19 - loss: 1.0908 - regression_loss: 0.9392 - classification_loss: 0.1516 262/500 [==============>...............] - ETA: 1:18 - loss: 1.0895 - regression_loss: 0.9380 - classification_loss: 0.1515 263/500 [==============>...............] - ETA: 1:18 - loss: 1.0879 - regression_loss: 0.9366 - classification_loss: 0.1512 264/500 [==============>...............] - ETA: 1:18 - loss: 1.0877 - regression_loss: 0.9367 - classification_loss: 0.1511 265/500 [==============>...............] - ETA: 1:17 - loss: 1.0890 - regression_loss: 0.9378 - classification_loss: 0.1512 266/500 [==============>...............] - ETA: 1:17 - loss: 1.0880 - regression_loss: 0.9370 - classification_loss: 0.1510 267/500 [===============>..............] - ETA: 1:17 - loss: 1.0852 - regression_loss: 0.9345 - classification_loss: 0.1506 268/500 [===============>..............] - ETA: 1:16 - loss: 1.0827 - regression_loss: 0.9325 - classification_loss: 0.1503 269/500 [===============>..............] - ETA: 1:16 - loss: 1.0829 - regression_loss: 0.9326 - classification_loss: 0.1503 270/500 [===============>..............] 
- ETA: 1:16 - loss: 1.0821 - regression_loss: 0.9320 - classification_loss: 0.1501
[... per-step progress updates for steps 271-494 of epoch 28 elided ...]
494/500 [============================>.]
- ETA: 1s - loss: 1.0792 - regression_loss: 0.9278 - classification_loss: 0.1513
[... per-step progress updates for steps 495-499 elided ...]
500/500 [==============================] - 165s 331ms/step - loss: 1.0791 - regression_loss: 0.9278 - classification_loss: 0.1513
1172 instances of class plum with average precision: 0.6111
mAP: 0.6111
Epoch 00028: saving model to ./training/snapshots/resnet101_pascal_28.h5
Epoch 29/150
[... per-step progress updates for steps 1-8 of epoch 29 elided ...]
9/500 [..............................]
- ETA: 2:41 - loss: 1.0229 - regression_loss: 0.8898 - classification_loss: 0.1331
[... per-step progress updates for steps 10-104 of epoch 29 elided ...]
105/500 [=====>........................]
- ETA: 2:10 - loss: 1.0348 - regression_loss: 0.8861 - classification_loss: 0.1487 106/500 [=====>........................] - ETA: 2:10 - loss: 1.0290 - regression_loss: 0.8812 - classification_loss: 0.1478 107/500 [=====>........................] - ETA: 2:10 - loss: 1.0298 - regression_loss: 0.8820 - classification_loss: 0.1478 108/500 [=====>........................] - ETA: 2:09 - loss: 1.0293 - regression_loss: 0.8815 - classification_loss: 0.1478 109/500 [=====>........................] - ETA: 2:09 - loss: 1.0300 - regression_loss: 0.8821 - classification_loss: 0.1480 110/500 [=====>........................] - ETA: 2:08 - loss: 1.0361 - regression_loss: 0.8872 - classification_loss: 0.1489 111/500 [=====>........................] - ETA: 2:08 - loss: 1.0354 - regression_loss: 0.8867 - classification_loss: 0.1487 112/500 [=====>........................] - ETA: 2:08 - loss: 1.0351 - regression_loss: 0.8868 - classification_loss: 0.1483 113/500 [=====>........................] - ETA: 2:08 - loss: 1.0357 - regression_loss: 0.8874 - classification_loss: 0.1483 114/500 [=====>........................] - ETA: 2:07 - loss: 1.0367 - regression_loss: 0.8883 - classification_loss: 0.1484 115/500 [=====>........................] - ETA: 2:07 - loss: 1.0392 - regression_loss: 0.8907 - classification_loss: 0.1485 116/500 [=====>........................] - ETA: 2:07 - loss: 1.0427 - regression_loss: 0.8938 - classification_loss: 0.1489 117/500 [======>.......................] - ETA: 2:06 - loss: 1.0467 - regression_loss: 0.8972 - classification_loss: 0.1495 118/500 [======>.......................] - ETA: 2:06 - loss: 1.0507 - regression_loss: 0.9005 - classification_loss: 0.1503 119/500 [======>.......................] - ETA: 2:06 - loss: 1.0507 - regression_loss: 0.9005 - classification_loss: 0.1502 120/500 [======>.......................] - ETA: 2:05 - loss: 1.0483 - regression_loss: 0.8986 - classification_loss: 0.1497 121/500 [======>.......................] 
- ETA: 2:05 - loss: 1.0439 - regression_loss: 0.8947 - classification_loss: 0.1492 122/500 [======>.......................] - ETA: 2:05 - loss: 1.0415 - regression_loss: 0.8928 - classification_loss: 0.1486 123/500 [======>.......................] - ETA: 2:04 - loss: 1.0418 - regression_loss: 0.8932 - classification_loss: 0.1486 124/500 [======>.......................] - ETA: 2:04 - loss: 1.0362 - regression_loss: 0.8883 - classification_loss: 0.1479 125/500 [======>.......................] - ETA: 2:04 - loss: 1.0387 - regression_loss: 0.8903 - classification_loss: 0.1483 126/500 [======>.......................] - ETA: 2:03 - loss: 1.0346 - regression_loss: 0.8870 - classification_loss: 0.1476 127/500 [======>.......................] - ETA: 2:03 - loss: 1.0393 - regression_loss: 0.8908 - classification_loss: 0.1485 128/500 [======>.......................] - ETA: 2:03 - loss: 1.0360 - regression_loss: 0.8883 - classification_loss: 0.1478 129/500 [======>.......................] - ETA: 2:02 - loss: 1.0353 - regression_loss: 0.8880 - classification_loss: 0.1473 130/500 [======>.......................] - ETA: 2:02 - loss: 1.0326 - regression_loss: 0.8856 - classification_loss: 0.1470 131/500 [======>.......................] - ETA: 2:02 - loss: 1.0320 - regression_loss: 0.8849 - classification_loss: 0.1471 132/500 [======>.......................] - ETA: 2:01 - loss: 1.0347 - regression_loss: 0.8873 - classification_loss: 0.1475 133/500 [======>.......................] - ETA: 2:01 - loss: 1.0369 - regression_loss: 0.8893 - classification_loss: 0.1476 134/500 [=======>......................] - ETA: 2:00 - loss: 1.0399 - regression_loss: 0.8919 - classification_loss: 0.1480 135/500 [=======>......................] - ETA: 2:00 - loss: 1.0410 - regression_loss: 0.8930 - classification_loss: 0.1480 136/500 [=======>......................] - ETA: 2:00 - loss: 1.0420 - regression_loss: 0.8939 - classification_loss: 0.1481 137/500 [=======>......................] 
- ETA: 1:59 - loss: 1.0450 - regression_loss: 0.8965 - classification_loss: 0.1485 138/500 [=======>......................] - ETA: 1:59 - loss: 1.0423 - regression_loss: 0.8943 - classification_loss: 0.1480 139/500 [=======>......................] - ETA: 1:59 - loss: 1.0437 - regression_loss: 0.8953 - classification_loss: 0.1484 140/500 [=======>......................] - ETA: 1:58 - loss: 1.0405 - regression_loss: 0.8928 - classification_loss: 0.1477 141/500 [=======>......................] - ETA: 1:58 - loss: 1.0426 - regression_loss: 0.8944 - classification_loss: 0.1482 142/500 [=======>......................] - ETA: 1:58 - loss: 1.0409 - regression_loss: 0.8930 - classification_loss: 0.1479 143/500 [=======>......................] - ETA: 1:57 - loss: 1.0433 - regression_loss: 0.8947 - classification_loss: 0.1485 144/500 [=======>......................] - ETA: 1:57 - loss: 1.0454 - regression_loss: 0.8966 - classification_loss: 0.1487 145/500 [=======>......................] - ETA: 1:57 - loss: 1.0507 - regression_loss: 0.9007 - classification_loss: 0.1500 146/500 [=======>......................] - ETA: 1:56 - loss: 1.0498 - regression_loss: 0.8999 - classification_loss: 0.1499 147/500 [=======>......................] - ETA: 1:56 - loss: 1.0445 - regression_loss: 0.8951 - classification_loss: 0.1494 148/500 [=======>......................] - ETA: 1:56 - loss: 1.0425 - regression_loss: 0.8937 - classification_loss: 0.1489 149/500 [=======>......................] - ETA: 1:55 - loss: 1.0410 - regression_loss: 0.8925 - classification_loss: 0.1485 150/500 [========>.....................] - ETA: 1:55 - loss: 1.0443 - regression_loss: 0.8951 - classification_loss: 0.1492 151/500 [========>.....................] - ETA: 1:55 - loss: 1.0448 - regression_loss: 0.8956 - classification_loss: 0.1492 152/500 [========>.....................] - ETA: 1:55 - loss: 1.0409 - regression_loss: 0.8920 - classification_loss: 0.1489 153/500 [========>.....................] 
- ETA: 1:54 - loss: 1.0416 - regression_loss: 0.8926 - classification_loss: 0.1490 154/500 [========>.....................] - ETA: 1:54 - loss: 1.0430 - regression_loss: 0.8939 - classification_loss: 0.1491 155/500 [========>.....................] - ETA: 1:54 - loss: 1.0455 - regression_loss: 0.8959 - classification_loss: 0.1497 156/500 [========>.....................] - ETA: 1:53 - loss: 1.0439 - regression_loss: 0.8944 - classification_loss: 0.1495 157/500 [========>.....................] - ETA: 1:53 - loss: 1.0445 - regression_loss: 0.8951 - classification_loss: 0.1494 158/500 [========>.....................] - ETA: 1:53 - loss: 1.0411 - regression_loss: 0.8922 - classification_loss: 0.1489 159/500 [========>.....................] - ETA: 1:52 - loss: 1.0385 - regression_loss: 0.8902 - classification_loss: 0.1483 160/500 [========>.....................] - ETA: 1:52 - loss: 1.0370 - regression_loss: 0.8888 - classification_loss: 0.1481 161/500 [========>.....................] - ETA: 1:52 - loss: 1.0336 - regression_loss: 0.8861 - classification_loss: 0.1475 162/500 [========>.....................] - ETA: 1:51 - loss: 1.0349 - regression_loss: 0.8874 - classification_loss: 0.1475 163/500 [========>.....................] - ETA: 1:51 - loss: 1.0367 - regression_loss: 0.8890 - classification_loss: 0.1477 164/500 [========>.....................] - ETA: 1:51 - loss: 1.0373 - regression_loss: 0.8897 - classification_loss: 0.1476 165/500 [========>.....................] - ETA: 1:50 - loss: 1.0338 - regression_loss: 0.8868 - classification_loss: 0.1470 166/500 [========>.....................] - ETA: 1:50 - loss: 1.0314 - regression_loss: 0.8847 - classification_loss: 0.1467 167/500 [=========>....................] - ETA: 1:50 - loss: 1.0298 - regression_loss: 0.8833 - classification_loss: 0.1465 168/500 [=========>....................] - ETA: 1:49 - loss: 1.0326 - regression_loss: 0.8854 - classification_loss: 0.1472 169/500 [=========>....................] 
- ETA: 1:49 - loss: 1.0333 - regression_loss: 0.8861 - classification_loss: 0.1471 170/500 [=========>....................] - ETA: 1:49 - loss: 1.0337 - regression_loss: 0.8866 - classification_loss: 0.1471 171/500 [=========>....................] - ETA: 1:48 - loss: 1.0371 - regression_loss: 0.8894 - classification_loss: 0.1477 172/500 [=========>....................] - ETA: 1:48 - loss: 1.0345 - regression_loss: 0.8869 - classification_loss: 0.1476 173/500 [=========>....................] - ETA: 1:48 - loss: 1.0359 - regression_loss: 0.8882 - classification_loss: 0.1477 174/500 [=========>....................] - ETA: 1:47 - loss: 1.0346 - regression_loss: 0.8872 - classification_loss: 0.1474 175/500 [=========>....................] - ETA: 1:47 - loss: 1.0358 - regression_loss: 0.8884 - classification_loss: 0.1474 176/500 [=========>....................] - ETA: 1:47 - loss: 1.0327 - regression_loss: 0.8859 - classification_loss: 0.1468 177/500 [=========>....................] - ETA: 1:46 - loss: 1.0332 - regression_loss: 0.8862 - classification_loss: 0.1470 178/500 [=========>....................] - ETA: 1:46 - loss: 1.0344 - regression_loss: 0.8872 - classification_loss: 0.1472 179/500 [=========>....................] - ETA: 1:46 - loss: 1.0375 - regression_loss: 0.8898 - classification_loss: 0.1477 180/500 [=========>....................] - ETA: 1:45 - loss: 1.0342 - regression_loss: 0.8869 - classification_loss: 0.1473 181/500 [=========>....................] - ETA: 1:45 - loss: 1.0307 - regression_loss: 0.8840 - classification_loss: 0.1467 182/500 [=========>....................] - ETA: 1:44 - loss: 1.0284 - regression_loss: 0.8821 - classification_loss: 0.1463 183/500 [=========>....................] - ETA: 1:44 - loss: 1.0281 - regression_loss: 0.8817 - classification_loss: 0.1464 184/500 [==========>...................] - ETA: 1:44 - loss: 1.0310 - regression_loss: 0.8838 - classification_loss: 0.1473 185/500 [==========>...................] 
- ETA: 1:43 - loss: 1.0282 - regression_loss: 0.8814 - classification_loss: 0.1468 186/500 [==========>...................] - ETA: 1:43 - loss: 1.0283 - regression_loss: 0.8814 - classification_loss: 0.1469 187/500 [==========>...................] - ETA: 1:43 - loss: 1.0274 - regression_loss: 0.8808 - classification_loss: 0.1465 188/500 [==========>...................] - ETA: 1:42 - loss: 1.0251 - regression_loss: 0.8790 - classification_loss: 0.1461 189/500 [==========>...................] - ETA: 1:42 - loss: 1.0267 - regression_loss: 0.8805 - classification_loss: 0.1462 190/500 [==========>...................] - ETA: 1:42 - loss: 1.0285 - regression_loss: 0.8820 - classification_loss: 0.1465 191/500 [==========>...................] - ETA: 1:41 - loss: 1.0282 - regression_loss: 0.8817 - classification_loss: 0.1465 192/500 [==========>...................] - ETA: 1:41 - loss: 1.0266 - regression_loss: 0.8802 - classification_loss: 0.1464 193/500 [==========>...................] - ETA: 1:41 - loss: 1.0280 - regression_loss: 0.8818 - classification_loss: 0.1463 194/500 [==========>...................] - ETA: 1:40 - loss: 1.0299 - regression_loss: 0.8832 - classification_loss: 0.1467 195/500 [==========>...................] - ETA: 1:40 - loss: 1.0289 - regression_loss: 0.8823 - classification_loss: 0.1467 196/500 [==========>...................] - ETA: 1:40 - loss: 1.0290 - regression_loss: 0.8825 - classification_loss: 0.1464 197/500 [==========>...................] - ETA: 1:40 - loss: 1.0269 - regression_loss: 0.8808 - classification_loss: 0.1461 198/500 [==========>...................] - ETA: 1:39 - loss: 1.0254 - regression_loss: 0.8797 - classification_loss: 0.1457 199/500 [==========>...................] - ETA: 1:39 - loss: 1.0265 - regression_loss: 0.8805 - classification_loss: 0.1459 200/500 [===========>..................] - ETA: 1:39 - loss: 1.0266 - regression_loss: 0.8807 - classification_loss: 0.1459 201/500 [===========>..................] 
- ETA: 1:38 - loss: 1.0313 - regression_loss: 0.8847 - classification_loss: 0.1466 202/500 [===========>..................] - ETA: 1:38 - loss: 1.0332 - regression_loss: 0.8864 - classification_loss: 0.1468 203/500 [===========>..................] - ETA: 1:38 - loss: 1.0362 - regression_loss: 0.8889 - classification_loss: 0.1473 204/500 [===========>..................] - ETA: 1:37 - loss: 1.0331 - regression_loss: 0.8863 - classification_loss: 0.1468 205/500 [===========>..................] - ETA: 1:37 - loss: 1.0302 - regression_loss: 0.8838 - classification_loss: 0.1464 206/500 [===========>..................] - ETA: 1:37 - loss: 1.0276 - regression_loss: 0.8816 - classification_loss: 0.1460 207/500 [===========>..................] - ETA: 1:36 - loss: 1.0268 - regression_loss: 0.8809 - classification_loss: 0.1459 208/500 [===========>..................] - ETA: 1:36 - loss: 1.0306 - regression_loss: 0.8842 - classification_loss: 0.1464 209/500 [===========>..................] - ETA: 1:36 - loss: 1.0321 - regression_loss: 0.8853 - classification_loss: 0.1468 210/500 [===========>..................] - ETA: 1:35 - loss: 1.0323 - regression_loss: 0.8854 - classification_loss: 0.1469 211/500 [===========>..................] - ETA: 1:35 - loss: 1.0328 - regression_loss: 0.8859 - classification_loss: 0.1469 212/500 [===========>..................] - ETA: 1:35 - loss: 1.0341 - regression_loss: 0.8871 - classification_loss: 0.1470 213/500 [===========>..................] - ETA: 1:34 - loss: 1.0328 - regression_loss: 0.8860 - classification_loss: 0.1468 214/500 [===========>..................] - ETA: 1:34 - loss: 1.0357 - regression_loss: 0.8886 - classification_loss: 0.1471 215/500 [===========>..................] - ETA: 1:34 - loss: 1.0352 - regression_loss: 0.8882 - classification_loss: 0.1471 216/500 [===========>..................] - ETA: 1:33 - loss: 1.0329 - regression_loss: 0.8861 - classification_loss: 0.1468 217/500 [============>.................] 
- ETA: 1:33 - loss: 1.0321 - regression_loss: 0.8855 - classification_loss: 0.1466 218/500 [============>.................] - ETA: 1:33 - loss: 1.0289 - regression_loss: 0.8827 - classification_loss: 0.1463 219/500 [============>.................] - ETA: 1:32 - loss: 1.0295 - regression_loss: 0.8831 - classification_loss: 0.1463 220/500 [============>.................] - ETA: 1:32 - loss: 1.0321 - regression_loss: 0.8853 - classification_loss: 0.1468 221/500 [============>.................] - ETA: 1:32 - loss: 1.0314 - regression_loss: 0.8846 - classification_loss: 0.1468 222/500 [============>.................] - ETA: 1:31 - loss: 1.0295 - regression_loss: 0.8829 - classification_loss: 0.1466 223/500 [============>.................] - ETA: 1:31 - loss: 1.0303 - regression_loss: 0.8837 - classification_loss: 0.1466 224/500 [============>.................] - ETA: 1:31 - loss: 1.0299 - regression_loss: 0.8831 - classification_loss: 0.1468 225/500 [============>.................] - ETA: 1:30 - loss: 1.0304 - regression_loss: 0.8834 - classification_loss: 0.1470 226/500 [============>.................] - ETA: 1:30 - loss: 1.0305 - regression_loss: 0.8835 - classification_loss: 0.1470 227/500 [============>.................] - ETA: 1:30 - loss: 1.0286 - regression_loss: 0.8819 - classification_loss: 0.1467 228/500 [============>.................] - ETA: 1:29 - loss: 1.0283 - regression_loss: 0.8819 - classification_loss: 0.1465 229/500 [============>.................] - ETA: 1:29 - loss: 1.0294 - regression_loss: 0.8830 - classification_loss: 0.1465 230/500 [============>.................] - ETA: 1:29 - loss: 1.0308 - regression_loss: 0.8841 - classification_loss: 0.1467 231/500 [============>.................] - ETA: 1:28 - loss: 1.0316 - regression_loss: 0.8850 - classification_loss: 0.1466 232/500 [============>.................] - ETA: 1:28 - loss: 1.0318 - regression_loss: 0.8851 - classification_loss: 0.1468 233/500 [============>.................] 
- ETA: 1:28 - loss: 1.0311 - regression_loss: 0.8844 - classification_loss: 0.1467 234/500 [=============>................] - ETA: 1:27 - loss: 1.0292 - regression_loss: 0.8829 - classification_loss: 0.1464 235/500 [=============>................] - ETA: 1:27 - loss: 1.0294 - regression_loss: 0.8829 - classification_loss: 0.1466 236/500 [=============>................] - ETA: 1:27 - loss: 1.0270 - regression_loss: 0.8808 - classification_loss: 0.1461 237/500 [=============>................] - ETA: 1:26 - loss: 1.0254 - regression_loss: 0.8795 - classification_loss: 0.1459 238/500 [=============>................] - ETA: 1:26 - loss: 1.0243 - regression_loss: 0.8788 - classification_loss: 0.1456 239/500 [=============>................] - ETA: 1:26 - loss: 1.0233 - regression_loss: 0.8780 - classification_loss: 0.1454 240/500 [=============>................] - ETA: 1:25 - loss: 1.0212 - regression_loss: 0.8760 - classification_loss: 0.1452 241/500 [=============>................] - ETA: 1:25 - loss: 1.0240 - regression_loss: 0.8782 - classification_loss: 0.1458 242/500 [=============>................] - ETA: 1:25 - loss: 1.0241 - regression_loss: 0.8782 - classification_loss: 0.1459 243/500 [=============>................] - ETA: 1:24 - loss: 1.0223 - regression_loss: 0.8766 - classification_loss: 0.1457 244/500 [=============>................] - ETA: 1:24 - loss: 1.0210 - regression_loss: 0.8755 - classification_loss: 0.1455 245/500 [=============>................] - ETA: 1:24 - loss: 1.0226 - regression_loss: 0.8768 - classification_loss: 0.1459 246/500 [=============>................] - ETA: 1:23 - loss: 1.0241 - regression_loss: 0.8782 - classification_loss: 0.1459 247/500 [=============>................] - ETA: 1:23 - loss: 1.0250 - regression_loss: 0.8792 - classification_loss: 0.1458 248/500 [=============>................] - ETA: 1:23 - loss: 1.0245 - regression_loss: 0.8787 - classification_loss: 0.1457 249/500 [=============>................] 
- ETA: 1:22 - loss: 1.0250 - regression_loss: 0.8795 - classification_loss: 0.1454 250/500 [==============>...............] - ETA: 1:22 - loss: 1.0252 - regression_loss: 0.8798 - classification_loss: 0.1454 251/500 [==============>...............] - ETA: 1:22 - loss: 1.0284 - regression_loss: 0.8825 - classification_loss: 0.1459 252/500 [==============>...............] - ETA: 1:21 - loss: 1.0258 - regression_loss: 0.8801 - classification_loss: 0.1456 253/500 [==============>...............] - ETA: 1:21 - loss: 1.0265 - regression_loss: 0.8808 - classification_loss: 0.1457 254/500 [==============>...............] - ETA: 1:21 - loss: 1.0274 - regression_loss: 0.8816 - classification_loss: 0.1458 255/500 [==============>...............] - ETA: 1:20 - loss: 1.0263 - regression_loss: 0.8806 - classification_loss: 0.1456 256/500 [==============>...............] - ETA: 1:20 - loss: 1.0251 - regression_loss: 0.8797 - classification_loss: 0.1454 257/500 [==============>...............] - ETA: 1:20 - loss: 1.0247 - regression_loss: 0.8793 - classification_loss: 0.1454 258/500 [==============>...............] - ETA: 1:19 - loss: 1.0253 - regression_loss: 0.8798 - classification_loss: 0.1455 259/500 [==============>...............] - ETA: 1:19 - loss: 1.0235 - regression_loss: 0.8783 - classification_loss: 0.1452 260/500 [==============>...............] - ETA: 1:19 - loss: 1.0217 - regression_loss: 0.8769 - classification_loss: 0.1448 261/500 [==============>...............] - ETA: 1:18 - loss: 1.0219 - regression_loss: 0.8770 - classification_loss: 0.1449 262/500 [==============>...............] - ETA: 1:18 - loss: 1.0204 - regression_loss: 0.8758 - classification_loss: 0.1446 263/500 [==============>...............] - ETA: 1:18 - loss: 1.0186 - regression_loss: 0.8742 - classification_loss: 0.1444 264/500 [==============>...............] - ETA: 1:17 - loss: 1.0166 - regression_loss: 0.8724 - classification_loss: 0.1442 265/500 [==============>...............] 
- ETA: 1:17 - loss: 1.0195 - regression_loss: 0.8747 - classification_loss: 0.1448 266/500 [==============>...............] - ETA: 1:17 - loss: 1.0187 - regression_loss: 0.8740 - classification_loss: 0.1447 267/500 [===============>..............] - ETA: 1:16 - loss: 1.0169 - regression_loss: 0.8725 - classification_loss: 0.1444 268/500 [===============>..............] - ETA: 1:16 - loss: 1.0176 - regression_loss: 0.8731 - classification_loss: 0.1445 269/500 [===============>..............] - ETA: 1:16 - loss: 1.0173 - regression_loss: 0.8729 - classification_loss: 0.1443 270/500 [===============>..............] - ETA: 1:15 - loss: 1.0147 - regression_loss: 0.8707 - classification_loss: 0.1440 271/500 [===============>..............] - ETA: 1:15 - loss: 1.0147 - regression_loss: 0.8706 - classification_loss: 0.1441 272/500 [===============>..............] - ETA: 1:15 - loss: 1.0160 - regression_loss: 0.8719 - classification_loss: 0.1441 273/500 [===============>..............] - ETA: 1:14 - loss: 1.0141 - regression_loss: 0.8703 - classification_loss: 0.1438 274/500 [===============>..............] - ETA: 1:14 - loss: 1.0128 - regression_loss: 0.8692 - classification_loss: 0.1437 275/500 [===============>..............] - ETA: 1:14 - loss: 1.0140 - regression_loss: 0.8700 - classification_loss: 0.1440 276/500 [===============>..............] - ETA: 1:13 - loss: 1.0129 - regression_loss: 0.8690 - classification_loss: 0.1439 277/500 [===============>..............] - ETA: 1:13 - loss: 1.0125 - regression_loss: 0.8687 - classification_loss: 0.1437 278/500 [===============>..............] - ETA: 1:13 - loss: 1.0115 - regression_loss: 0.8679 - classification_loss: 0.1436 279/500 [===============>..............] - ETA: 1:12 - loss: 1.0108 - regression_loss: 0.8673 - classification_loss: 0.1435 280/500 [===============>..............] - ETA: 1:12 - loss: 1.0090 - regression_loss: 0.8658 - classification_loss: 0.1433 281/500 [===============>..............] 
- ETA: 1:12 - loss: 1.0085 - regression_loss: 0.8653 - classification_loss: 0.1432 282/500 [===============>..............] - ETA: 1:11 - loss: 1.0099 - regression_loss: 0.8666 - classification_loss: 0.1433 283/500 [===============>..............] - ETA: 1:11 - loss: 1.0071 - regression_loss: 0.8642 - classification_loss: 0.1429 284/500 [================>.............] - ETA: 1:11 - loss: 1.0078 - regression_loss: 0.8649 - classification_loss: 0.1429 285/500 [================>.............] - ETA: 1:10 - loss: 1.0086 - regression_loss: 0.8656 - classification_loss: 0.1430 286/500 [================>.............] - ETA: 1:10 - loss: 1.0110 - regression_loss: 0.8676 - classification_loss: 0.1434 287/500 [================>.............] - ETA: 1:10 - loss: 1.0113 - regression_loss: 0.8680 - classification_loss: 0.1434 288/500 [================>.............] - ETA: 1:09 - loss: 1.0111 - regression_loss: 0.8677 - classification_loss: 0.1434 289/500 [================>.............] - ETA: 1:09 - loss: 1.0107 - regression_loss: 0.8675 - classification_loss: 0.1433 290/500 [================>.............] - ETA: 1:09 - loss: 1.0114 - regression_loss: 0.8680 - classification_loss: 0.1433 291/500 [================>.............] - ETA: 1:08 - loss: 1.0099 - regression_loss: 0.8667 - classification_loss: 0.1432 292/500 [================>.............] - ETA: 1:08 - loss: 1.0105 - regression_loss: 0.8672 - classification_loss: 0.1433 293/500 [================>.............] - ETA: 1:08 - loss: 1.0106 - regression_loss: 0.8673 - classification_loss: 0.1433 294/500 [================>.............] - ETA: 1:07 - loss: 1.0086 - regression_loss: 0.8657 - classification_loss: 0.1429 295/500 [================>.............] - ETA: 1:07 - loss: 1.0087 - regression_loss: 0.8657 - classification_loss: 0.1431 296/500 [================>.............] - ETA: 1:07 - loss: 1.0114 - regression_loss: 0.8678 - classification_loss: 0.1435 297/500 [================>.............] 
- ETA: 1:07 - loss: 1.0098 - regression_loss: 0.8664 - classification_loss: 0.1434 298/500 [================>.............] - ETA: 1:06 - loss: 1.0112 - regression_loss: 0.8676 - classification_loss: 0.1436 299/500 [================>.............] - ETA: 1:06 - loss: 1.0134 - regression_loss: 0.8694 - classification_loss: 0.1440 300/500 [=================>............] - ETA: 1:06 - loss: 1.0134 - regression_loss: 0.8695 - classification_loss: 0.1439 301/500 [=================>............] - ETA: 1:05 - loss: 1.0141 - regression_loss: 0.8701 - classification_loss: 0.1441 302/500 [=================>............] - ETA: 1:05 - loss: 1.0145 - regression_loss: 0.8703 - classification_loss: 0.1442 303/500 [=================>............] - ETA: 1:05 - loss: 1.0132 - regression_loss: 0.8691 - classification_loss: 0.1441 304/500 [=================>............] - ETA: 1:04 - loss: 1.0142 - regression_loss: 0.8699 - classification_loss: 0.1443 305/500 [=================>............] - ETA: 1:04 - loss: 1.0136 - regression_loss: 0.8694 - classification_loss: 0.1442 306/500 [=================>............] - ETA: 1:04 - loss: 1.0127 - regression_loss: 0.8686 - classification_loss: 0.1441 307/500 [=================>............] - ETA: 1:03 - loss: 1.0118 - regression_loss: 0.8679 - classification_loss: 0.1439 308/500 [=================>............] - ETA: 1:03 - loss: 1.0128 - regression_loss: 0.8688 - classification_loss: 0.1440 309/500 [=================>............] - ETA: 1:03 - loss: 1.0145 - regression_loss: 0.8704 - classification_loss: 0.1442 310/500 [=================>............] - ETA: 1:02 - loss: 1.0131 - regression_loss: 0.8692 - classification_loss: 0.1439 311/500 [=================>............] - ETA: 1:02 - loss: 1.0117 - regression_loss: 0.8681 - classification_loss: 0.1436 312/500 [=================>............] - ETA: 1:02 - loss: 1.0129 - regression_loss: 0.8689 - classification_loss: 0.1439 313/500 [=================>............] 
[... per-batch progress lines elided: epoch 29, steps 314-489 of 500; total loss fluctuates in the 1.01-1.04 range (regression_loss ~0.87-0.89, classification_loss ~0.143-0.148) ...]
[... per-batch progress lines elided: epoch 29, steps 490-499 ...]
500/500 [==============================] - 165s 330ms/step - loss: 1.0422 - regression_loss: 0.8943 - classification_loss: 0.1479
1172 instances of class plum with average precision: 0.5952
mAP: 0.5952
Epoch 00029: saving model to ./training/snapshots/resnet101_pascal_29.h5
Epoch 30/150
[... per-batch progress lines elided: epoch 30, steps 1-4 ...]
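The evaluation lines above report per-class average precision and the mean over classes (mAP); with a single class ("plum"), the two coincide. As a rough illustration of how Pascal VOC-style AP is derived from score-sorted detections, here is a minimal sketch (assumed semantics for illustration, not the library's exact evaluation code):

```python
import numpy as np

def average_precision(tp_flags, num_gt):
    """VOC-style AP: area under the interpolated precision-recall curve.

    tp_flags: 1 for a true positive, 0 for a false positive, per detection,
              sorted by descending confidence score.
    num_gt:   total number of ground-truth boxes for the class.
    """
    tp_flags = np.asarray(tp_flags, dtype=np.float64)
    tp = np.cumsum(tp_flags)            # running true-positive count
    fp = np.cumsum(1.0 - tp_flags)      # running false-positive count
    recall = tp / num_gt
    precision = tp / np.maximum(tp + fp, np.finfo(np.float64).eps)

    # Append sentinel points, enforce monotonically decreasing precision,
    # then integrate the stepwise precision-recall curve.
    mrec = np.concatenate(([0.0], recall, [1.0]))
    mpre = np.concatenate(([0.0], precision, [0.0]))
    for i in range(mpre.size - 1, 0, -1):
        mpre[i - 1] = max(mpre[i - 1], mpre[i])
    idx = np.where(mrec[1:] != mrec[:-1])[0]
    return float(np.sum((mrec[idx + 1] - mrec[idx]) * mpre[idx + 1]))

# mAP is then the mean of per-class APs; with one class it equals that AP.
```

With a perfect detector (`average_precision([1, 1], 2)`) this returns 1.0; mixing in false positives lowers the area under the curve accordingly.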
[... per-batch progress lines elided: epoch 30, steps 5-148 of 500; total loss settles from ~1.35 early in the epoch to the 1.02-1.04 range (regression_loss ~0.87-0.89, classification_loss ~0.14-0.15) ...]
- ETA: 1:57 - loss: 1.0167 - regression_loss: 0.8743 - classification_loss: 0.1424 149/500 [=======>......................] - ETA: 1:56 - loss: 1.0182 - regression_loss: 0.8761 - classification_loss: 0.1421 150/500 [========>.....................] - ETA: 1:56 - loss: 1.0214 - regression_loss: 0.8786 - classification_loss: 0.1428 151/500 [========>.....................] - ETA: 1:56 - loss: 1.0246 - regression_loss: 0.8815 - classification_loss: 0.1431 152/500 [========>.....................] - ETA: 1:55 - loss: 1.0220 - regression_loss: 0.8794 - classification_loss: 0.1426 153/500 [========>.....................] - ETA: 1:55 - loss: 1.0245 - regression_loss: 0.8815 - classification_loss: 0.1429 154/500 [========>.....................] - ETA: 1:54 - loss: 1.0260 - regression_loss: 0.8831 - classification_loss: 0.1429 155/500 [========>.....................] - ETA: 1:54 - loss: 1.0324 - regression_loss: 0.8885 - classification_loss: 0.1440 156/500 [========>.....................] - ETA: 1:54 - loss: 1.0320 - regression_loss: 0.8882 - classification_loss: 0.1438 157/500 [========>.....................] - ETA: 1:54 - loss: 1.0334 - regression_loss: 0.8892 - classification_loss: 0.1442 158/500 [========>.....................] - ETA: 1:53 - loss: 1.0313 - regression_loss: 0.8875 - classification_loss: 0.1438 159/500 [========>.....................] - ETA: 1:53 - loss: 1.0327 - regression_loss: 0.8888 - classification_loss: 0.1440 160/500 [========>.....................] - ETA: 1:52 - loss: 1.0299 - regression_loss: 0.8864 - classification_loss: 0.1435 161/500 [========>.....................] - ETA: 1:52 - loss: 1.0303 - regression_loss: 0.8867 - classification_loss: 0.1435 162/500 [========>.....................] - ETA: 1:52 - loss: 1.0313 - regression_loss: 0.8876 - classification_loss: 0.1437 163/500 [========>.....................] - ETA: 1:51 - loss: 1.0323 - regression_loss: 0.8886 - classification_loss: 0.1438 164/500 [========>.....................] 
- ETA: 1:51 - loss: 1.0330 - regression_loss: 0.8891 - classification_loss: 0.1438 165/500 [========>.....................] - ETA: 1:51 - loss: 1.0311 - regression_loss: 0.8876 - classification_loss: 0.1435 166/500 [========>.....................] - ETA: 1:50 - loss: 1.0287 - regression_loss: 0.8856 - classification_loss: 0.1431 167/500 [=========>....................] - ETA: 1:50 - loss: 1.0253 - regression_loss: 0.8823 - classification_loss: 0.1430 168/500 [=========>....................] - ETA: 1:50 - loss: 1.0236 - regression_loss: 0.8808 - classification_loss: 0.1428 169/500 [=========>....................] - ETA: 1:49 - loss: 1.0235 - regression_loss: 0.8808 - classification_loss: 0.1428 170/500 [=========>....................] - ETA: 1:49 - loss: 1.0237 - regression_loss: 0.8809 - classification_loss: 0.1428 171/500 [=========>....................] - ETA: 1:49 - loss: 1.0265 - regression_loss: 0.8829 - classification_loss: 0.1436 172/500 [=========>....................] - ETA: 1:49 - loss: 1.0283 - regression_loss: 0.8848 - classification_loss: 0.1435 173/500 [=========>....................] - ETA: 1:48 - loss: 1.0293 - regression_loss: 0.8855 - classification_loss: 0.1439 174/500 [=========>....................] - ETA: 1:48 - loss: 1.0315 - regression_loss: 0.8874 - classification_loss: 0.1441 175/500 [=========>....................] - ETA: 1:48 - loss: 1.0318 - regression_loss: 0.8878 - classification_loss: 0.1441 176/500 [=========>....................] - ETA: 1:47 - loss: 1.0346 - regression_loss: 0.8901 - classification_loss: 0.1445 177/500 [=========>....................] - ETA: 1:47 - loss: 1.0361 - regression_loss: 0.8913 - classification_loss: 0.1448 178/500 [=========>....................] - ETA: 1:47 - loss: 1.0341 - regression_loss: 0.8898 - classification_loss: 0.1444 179/500 [=========>....................] - ETA: 1:46 - loss: 1.0304 - regression_loss: 0.8865 - classification_loss: 0.1439 180/500 [=========>....................] 
- ETA: 1:46 - loss: 1.0277 - regression_loss: 0.8843 - classification_loss: 0.1434 181/500 [=========>....................] - ETA: 1:46 - loss: 1.0299 - regression_loss: 0.8862 - classification_loss: 0.1437 182/500 [=========>....................] - ETA: 1:45 - loss: 1.0308 - regression_loss: 0.8872 - classification_loss: 0.1436 183/500 [=========>....................] - ETA: 1:45 - loss: 1.0280 - regression_loss: 0.8848 - classification_loss: 0.1431 184/500 [==========>...................] - ETA: 1:45 - loss: 1.0266 - regression_loss: 0.8838 - classification_loss: 0.1428 185/500 [==========>...................] - ETA: 1:44 - loss: 1.0251 - regression_loss: 0.8825 - classification_loss: 0.1426 186/500 [==========>...................] - ETA: 1:44 - loss: 1.0232 - regression_loss: 0.8808 - classification_loss: 0.1424 187/500 [==========>...................] - ETA: 1:44 - loss: 1.0251 - regression_loss: 0.8825 - classification_loss: 0.1426 188/500 [==========>...................] - ETA: 1:43 - loss: 1.0235 - regression_loss: 0.8813 - classification_loss: 0.1421 189/500 [==========>...................] - ETA: 1:43 - loss: 1.0208 - regression_loss: 0.8791 - classification_loss: 0.1418 190/500 [==========>...................] - ETA: 1:43 - loss: 1.0181 - regression_loss: 0.8768 - classification_loss: 0.1413 191/500 [==========>...................] - ETA: 1:42 - loss: 1.0184 - regression_loss: 0.8774 - classification_loss: 0.1410 192/500 [==========>...................] - ETA: 1:42 - loss: 1.0171 - regression_loss: 0.8759 - classification_loss: 0.1412 193/500 [==========>...................] - ETA: 1:42 - loss: 1.0134 - regression_loss: 0.8727 - classification_loss: 0.1407 194/500 [==========>...................] - ETA: 1:41 - loss: 1.0097 - regression_loss: 0.8694 - classification_loss: 0.1403 195/500 [==========>...................] - ETA: 1:41 - loss: 1.0098 - regression_loss: 0.8693 - classification_loss: 0.1405 196/500 [==========>...................] 
- ETA: 1:41 - loss: 1.0081 - regression_loss: 0.8679 - classification_loss: 0.1402 197/500 [==========>...................] - ETA: 1:40 - loss: 1.0063 - regression_loss: 0.8663 - classification_loss: 0.1400 198/500 [==========>...................] - ETA: 1:40 - loss: 1.0052 - regression_loss: 0.8655 - classification_loss: 0.1398 199/500 [==========>...................] - ETA: 1:40 - loss: 1.0065 - regression_loss: 0.8663 - classification_loss: 0.1402 200/500 [===========>..................] - ETA: 1:39 - loss: 1.0083 - regression_loss: 0.8677 - classification_loss: 0.1406 201/500 [===========>..................] - ETA: 1:39 - loss: 1.0079 - regression_loss: 0.8674 - classification_loss: 0.1404 202/500 [===========>..................] - ETA: 1:38 - loss: 1.0066 - regression_loss: 0.8664 - classification_loss: 0.1402 203/500 [===========>..................] - ETA: 1:38 - loss: 1.0081 - regression_loss: 0.8676 - classification_loss: 0.1405 204/500 [===========>..................] - ETA: 1:38 - loss: 1.0110 - regression_loss: 0.8699 - classification_loss: 0.1411 205/500 [===========>..................] - ETA: 1:37 - loss: 1.0090 - regression_loss: 0.8682 - classification_loss: 0.1408 206/500 [===========>..................] - ETA: 1:37 - loss: 1.0112 - regression_loss: 0.8701 - classification_loss: 0.1411 207/500 [===========>..................] - ETA: 1:37 - loss: 1.0078 - regression_loss: 0.8670 - classification_loss: 0.1408 208/500 [===========>..................] - ETA: 1:37 - loss: 1.0080 - regression_loss: 0.8673 - classification_loss: 0.1408 209/500 [===========>..................] - ETA: 1:36 - loss: 1.0075 - regression_loss: 0.8670 - classification_loss: 0.1406 210/500 [===========>..................] - ETA: 1:36 - loss: 1.0053 - regression_loss: 0.8652 - classification_loss: 0.1401 211/500 [===========>..................] - ETA: 1:36 - loss: 1.0054 - regression_loss: 0.8653 - classification_loss: 0.1401 212/500 [===========>..................] 
- ETA: 1:35 - loss: 1.0019 - regression_loss: 0.8622 - classification_loss: 0.1397 213/500 [===========>..................] - ETA: 1:35 - loss: 1.0020 - regression_loss: 0.8624 - classification_loss: 0.1396 214/500 [===========>..................] - ETA: 1:35 - loss: 0.9992 - regression_loss: 0.8599 - classification_loss: 0.1393 215/500 [===========>..................] - ETA: 1:34 - loss: 0.9970 - regression_loss: 0.8581 - classification_loss: 0.1389 216/500 [===========>..................] - ETA: 1:34 - loss: 0.9954 - regression_loss: 0.8568 - classification_loss: 0.1386 217/500 [============>.................] - ETA: 1:34 - loss: 0.9979 - regression_loss: 0.8587 - classification_loss: 0.1392 218/500 [============>.................] - ETA: 1:33 - loss: 0.9984 - regression_loss: 0.8590 - classification_loss: 0.1394 219/500 [============>.................] - ETA: 1:33 - loss: 0.9986 - regression_loss: 0.8590 - classification_loss: 0.1395 220/500 [============>.................] - ETA: 1:33 - loss: 0.9999 - regression_loss: 0.8601 - classification_loss: 0.1398 221/500 [============>.................] - ETA: 1:32 - loss: 1.0013 - regression_loss: 0.8613 - classification_loss: 0.1400 222/500 [============>.................] - ETA: 1:32 - loss: 1.0003 - regression_loss: 0.8604 - classification_loss: 0.1399 223/500 [============>.................] - ETA: 1:32 - loss: 1.0007 - regression_loss: 0.8608 - classification_loss: 0.1399 224/500 [============>.................] - ETA: 1:31 - loss: 1.0012 - regression_loss: 0.8613 - classification_loss: 0.1399 225/500 [============>.................] - ETA: 1:31 - loss: 0.9991 - regression_loss: 0.8593 - classification_loss: 0.1398 226/500 [============>.................] - ETA: 1:31 - loss: 1.0005 - regression_loss: 0.8606 - classification_loss: 0.1399 227/500 [============>.................] - ETA: 1:30 - loss: 1.0000 - regression_loss: 0.8603 - classification_loss: 0.1397 228/500 [============>.................] 
- ETA: 1:30 - loss: 1.0039 - regression_loss: 0.8636 - classification_loss: 0.1403 229/500 [============>.................] - ETA: 1:30 - loss: 1.0067 - regression_loss: 0.8661 - classification_loss: 0.1406 230/500 [============>.................] - ETA: 1:29 - loss: 1.0090 - regression_loss: 0.8680 - classification_loss: 0.1410 231/500 [============>.................] - ETA: 1:29 - loss: 1.0078 - regression_loss: 0.8672 - classification_loss: 0.1406 232/500 [============>.................] - ETA: 1:29 - loss: 1.0104 - regression_loss: 0.8694 - classification_loss: 0.1409 233/500 [============>.................] - ETA: 1:28 - loss: 1.0102 - regression_loss: 0.8693 - classification_loss: 0.1409 234/500 [=============>................] - ETA: 1:28 - loss: 1.0131 - regression_loss: 0.8716 - classification_loss: 0.1416 235/500 [=============>................] - ETA: 1:28 - loss: 1.0146 - regression_loss: 0.8728 - classification_loss: 0.1418 236/500 [=============>................] - ETA: 1:27 - loss: 1.0134 - regression_loss: 0.8719 - classification_loss: 0.1415 237/500 [=============>................] - ETA: 1:27 - loss: 1.0121 - regression_loss: 0.8708 - classification_loss: 0.1413 238/500 [=============>................] - ETA: 1:27 - loss: 1.0111 - regression_loss: 0.8699 - classification_loss: 0.1412 239/500 [=============>................] - ETA: 1:26 - loss: 1.0097 - regression_loss: 0.8686 - classification_loss: 0.1411 240/500 [=============>................] - ETA: 1:26 - loss: 1.0071 - regression_loss: 0.8663 - classification_loss: 0.1408 241/500 [=============>................] - ETA: 1:26 - loss: 1.0077 - regression_loss: 0.8669 - classification_loss: 0.1408 242/500 [=============>................] - ETA: 1:25 - loss: 1.0082 - regression_loss: 0.8673 - classification_loss: 0.1409 243/500 [=============>................] - ETA: 1:25 - loss: 1.0089 - regression_loss: 0.8679 - classification_loss: 0.1410 244/500 [=============>................] 
- ETA: 1:25 - loss: 1.0105 - regression_loss: 0.8691 - classification_loss: 0.1413 245/500 [=============>................] - ETA: 1:24 - loss: 1.0146 - regression_loss: 0.8726 - classification_loss: 0.1419 246/500 [=============>................] - ETA: 1:24 - loss: 1.0172 - regression_loss: 0.8747 - classification_loss: 0.1425 247/500 [=============>................] - ETA: 1:24 - loss: 1.0208 - regression_loss: 0.8777 - classification_loss: 0.1431 248/500 [=============>................] - ETA: 1:23 - loss: 1.0195 - regression_loss: 0.8767 - classification_loss: 0.1428 249/500 [=============>................] - ETA: 1:23 - loss: 1.0221 - regression_loss: 0.8787 - classification_loss: 0.1434 250/500 [==============>...............] - ETA: 1:22 - loss: 1.0241 - regression_loss: 0.8804 - classification_loss: 0.1437 251/500 [==============>...............] - ETA: 1:22 - loss: 1.0237 - regression_loss: 0.8802 - classification_loss: 0.1436 252/500 [==============>...............] - ETA: 1:22 - loss: 1.0240 - regression_loss: 0.8800 - classification_loss: 0.1440 253/500 [==============>...............] - ETA: 1:21 - loss: 1.0232 - regression_loss: 0.8794 - classification_loss: 0.1439 254/500 [==============>...............] - ETA: 1:21 - loss: 1.0237 - regression_loss: 0.8798 - classification_loss: 0.1439 255/500 [==============>...............] - ETA: 1:21 - loss: 1.0229 - regression_loss: 0.8790 - classification_loss: 0.1439 256/500 [==============>...............] - ETA: 1:20 - loss: 1.0249 - regression_loss: 0.8805 - classification_loss: 0.1444 257/500 [==============>...............] - ETA: 1:20 - loss: 1.0238 - regression_loss: 0.8796 - classification_loss: 0.1443 258/500 [==============>...............] - ETA: 1:20 - loss: 1.0262 - regression_loss: 0.8815 - classification_loss: 0.1448 259/500 [==============>...............] - ETA: 1:19 - loss: 1.0239 - regression_loss: 0.8795 - classification_loss: 0.1444 260/500 [==============>...............] 
- ETA: 1:19 - loss: 1.0239 - regression_loss: 0.8793 - classification_loss: 0.1446 261/500 [==============>...............] - ETA: 1:19 - loss: 1.0268 - regression_loss: 0.8818 - classification_loss: 0.1450 262/500 [==============>...............] - ETA: 1:18 - loss: 1.0276 - regression_loss: 0.8824 - classification_loss: 0.1452 263/500 [==============>...............] - ETA: 1:18 - loss: 1.0272 - regression_loss: 0.8821 - classification_loss: 0.1451 264/500 [==============>...............] - ETA: 1:18 - loss: 1.0271 - regression_loss: 0.8822 - classification_loss: 0.1449 265/500 [==============>...............] - ETA: 1:17 - loss: 1.0254 - regression_loss: 0.8808 - classification_loss: 0.1447 266/500 [==============>...............] - ETA: 1:17 - loss: 1.0260 - regression_loss: 0.8813 - classification_loss: 0.1447 267/500 [===============>..............] - ETA: 1:17 - loss: 1.0268 - regression_loss: 0.8820 - classification_loss: 0.1449 268/500 [===============>..............] - ETA: 1:16 - loss: 1.0267 - regression_loss: 0.8818 - classification_loss: 0.1449 269/500 [===============>..............] - ETA: 1:16 - loss: 1.0261 - regression_loss: 0.8812 - classification_loss: 0.1449 270/500 [===============>..............] - ETA: 1:16 - loss: 1.0270 - regression_loss: 0.8821 - classification_loss: 0.1449 271/500 [===============>..............] - ETA: 1:15 - loss: 1.0265 - regression_loss: 0.8816 - classification_loss: 0.1449 272/500 [===============>..............] - ETA: 1:15 - loss: 1.0268 - regression_loss: 0.8819 - classification_loss: 0.1449 273/500 [===============>..............] - ETA: 1:15 - loss: 1.0249 - regression_loss: 0.8804 - classification_loss: 0.1445 274/500 [===============>..............] - ETA: 1:14 - loss: 1.0277 - regression_loss: 0.8827 - classification_loss: 0.1450 275/500 [===============>..............] - ETA: 1:14 - loss: 1.0301 - regression_loss: 0.8846 - classification_loss: 0.1455 276/500 [===============>..............] 
- ETA: 1:14 - loss: 1.0310 - regression_loss: 0.8855 - classification_loss: 0.1455 277/500 [===============>..............] - ETA: 1:13 - loss: 1.0310 - regression_loss: 0.8853 - classification_loss: 0.1456 278/500 [===============>..............] - ETA: 1:13 - loss: 1.0319 - regression_loss: 0.8862 - classification_loss: 0.1458 279/500 [===============>..............] - ETA: 1:13 - loss: 1.0331 - regression_loss: 0.8870 - classification_loss: 0.1461 280/500 [===============>..............] - ETA: 1:13 - loss: 1.0343 - regression_loss: 0.8880 - classification_loss: 0.1463 281/500 [===============>..............] - ETA: 1:12 - loss: 1.0341 - regression_loss: 0.8877 - classification_loss: 0.1463 282/500 [===============>..............] - ETA: 1:12 - loss: 1.0325 - regression_loss: 0.8865 - classification_loss: 0.1460 283/500 [===============>..............] - ETA: 1:12 - loss: 1.0330 - regression_loss: 0.8870 - classification_loss: 0.1460 284/500 [================>.............] - ETA: 1:11 - loss: 1.0335 - regression_loss: 0.8876 - classification_loss: 0.1459 285/500 [================>.............] - ETA: 1:11 - loss: 1.0342 - regression_loss: 0.8883 - classification_loss: 0.1459 286/500 [================>.............] - ETA: 1:11 - loss: 1.0314 - regression_loss: 0.8858 - classification_loss: 0.1456 287/500 [================>.............] - ETA: 1:10 - loss: 1.0308 - regression_loss: 0.8852 - classification_loss: 0.1455 288/500 [================>.............] - ETA: 1:10 - loss: 1.0307 - regression_loss: 0.8852 - classification_loss: 0.1455 289/500 [================>.............] - ETA: 1:10 - loss: 1.0292 - regression_loss: 0.8840 - classification_loss: 0.1452 290/500 [================>.............] - ETA: 1:09 - loss: 1.0276 - regression_loss: 0.8826 - classification_loss: 0.1450 291/500 [================>.............] - ETA: 1:09 - loss: 1.0274 - regression_loss: 0.8824 - classification_loss: 0.1450 292/500 [================>.............] 
- ETA: 1:09 - loss: 1.0285 - regression_loss: 0.8832 - classification_loss: 0.1453 293/500 [================>.............] - ETA: 1:08 - loss: 1.0289 - regression_loss: 0.8836 - classification_loss: 0.1454 294/500 [================>.............] - ETA: 1:08 - loss: 1.0280 - regression_loss: 0.8828 - classification_loss: 0.1452 295/500 [================>.............] - ETA: 1:08 - loss: 1.0270 - regression_loss: 0.8821 - classification_loss: 0.1449 296/500 [================>.............] - ETA: 1:07 - loss: 1.0264 - regression_loss: 0.8815 - classification_loss: 0.1449 297/500 [================>.............] - ETA: 1:07 - loss: 1.0259 - regression_loss: 0.8810 - classification_loss: 0.1448 298/500 [================>.............] - ETA: 1:07 - loss: 1.0250 - regression_loss: 0.8802 - classification_loss: 0.1448 299/500 [================>.............] - ETA: 1:06 - loss: 1.0259 - regression_loss: 0.8810 - classification_loss: 0.1449 300/500 [=================>............] - ETA: 1:06 - loss: 1.0264 - regression_loss: 0.8814 - classification_loss: 0.1449 301/500 [=================>............] - ETA: 1:06 - loss: 1.0271 - regression_loss: 0.8820 - classification_loss: 0.1451 302/500 [=================>............] - ETA: 1:05 - loss: 1.0268 - regression_loss: 0.8819 - classification_loss: 0.1449 303/500 [=================>............] - ETA: 1:05 - loss: 1.0281 - regression_loss: 0.8831 - classification_loss: 0.1450 304/500 [=================>............] - ETA: 1:05 - loss: 1.0276 - regression_loss: 0.8826 - classification_loss: 0.1450 305/500 [=================>............] - ETA: 1:04 - loss: 1.0282 - regression_loss: 0.8831 - classification_loss: 0.1451 306/500 [=================>............] - ETA: 1:04 - loss: 1.0282 - regression_loss: 0.8832 - classification_loss: 0.1450 307/500 [=================>............] - ETA: 1:04 - loss: 1.0273 - regression_loss: 0.8826 - classification_loss: 0.1447 308/500 [=================>............] 
- ETA: 1:03 - loss: 1.0285 - regression_loss: 0.8836 - classification_loss: 0.1449 309/500 [=================>............] - ETA: 1:03 - loss: 1.0281 - regression_loss: 0.8830 - classification_loss: 0.1451 310/500 [=================>............] - ETA: 1:03 - loss: 1.0301 - regression_loss: 0.8848 - classification_loss: 0.1453 311/500 [=================>............] - ETA: 1:02 - loss: 1.0300 - regression_loss: 0.8845 - classification_loss: 0.1455 312/500 [=================>............] - ETA: 1:02 - loss: 1.0301 - regression_loss: 0.8846 - classification_loss: 0.1455 313/500 [=================>............] - ETA: 1:02 - loss: 1.0321 - regression_loss: 0.8862 - classification_loss: 0.1459 314/500 [=================>............] - ETA: 1:01 - loss: 1.0324 - regression_loss: 0.8867 - classification_loss: 0.1457 315/500 [=================>............] - ETA: 1:01 - loss: 1.0331 - regression_loss: 0.8873 - classification_loss: 0.1458 316/500 [=================>............] - ETA: 1:01 - loss: 1.0326 - regression_loss: 0.8869 - classification_loss: 0.1457 317/500 [==================>...........] - ETA: 1:00 - loss: 1.0329 - regression_loss: 0.8871 - classification_loss: 0.1458 318/500 [==================>...........] - ETA: 1:00 - loss: 1.0319 - regression_loss: 0.8862 - classification_loss: 0.1456 319/500 [==================>...........] - ETA: 1:00 - loss: 1.0329 - regression_loss: 0.8872 - classification_loss: 0.1457 320/500 [==================>...........] - ETA: 59s - loss: 1.0346 - regression_loss: 0.8886 - classification_loss: 0.1460  321/500 [==================>...........] - ETA: 59s - loss: 1.0350 - regression_loss: 0.8890 - classification_loss: 0.1460 322/500 [==================>...........] - ETA: 59s - loss: 1.0355 - regression_loss: 0.8895 - classification_loss: 0.1460 323/500 [==================>...........] - ETA: 58s - loss: 1.0353 - regression_loss: 0.8894 - classification_loss: 0.1459 324/500 [==================>...........] 
- ETA: 58s - loss: 1.0336 - regression_loss: 0.8880 - classification_loss: 0.1456 325/500 [==================>...........] - ETA: 58s - loss: 1.0326 - regression_loss: 0.8871 - classification_loss: 0.1455 326/500 [==================>...........] - ETA: 57s - loss: 1.0338 - regression_loss: 0.8882 - classification_loss: 0.1457 327/500 [==================>...........] - ETA: 57s - loss: 1.0343 - regression_loss: 0.8887 - classification_loss: 0.1457 328/500 [==================>...........] - ETA: 57s - loss: 1.0346 - regression_loss: 0.8889 - classification_loss: 0.1457 329/500 [==================>...........] - ETA: 56s - loss: 1.0338 - regression_loss: 0.8882 - classification_loss: 0.1456 330/500 [==================>...........] - ETA: 56s - loss: 1.0344 - regression_loss: 0.8887 - classification_loss: 0.1456 331/500 [==================>...........] - ETA: 56s - loss: 1.0338 - regression_loss: 0.8884 - classification_loss: 0.1454 332/500 [==================>...........] - ETA: 55s - loss: 1.0344 - regression_loss: 0.8890 - classification_loss: 0.1454 333/500 [==================>...........] - ETA: 55s - loss: 1.0327 - regression_loss: 0.8876 - classification_loss: 0.1451 334/500 [===================>..........] - ETA: 55s - loss: 1.0329 - regression_loss: 0.8878 - classification_loss: 0.1452 335/500 [===================>..........] - ETA: 54s - loss: 1.0306 - regression_loss: 0.8857 - classification_loss: 0.1449 336/500 [===================>..........] - ETA: 54s - loss: 1.0294 - regression_loss: 0.8847 - classification_loss: 0.1447 337/500 [===================>..........] - ETA: 53s - loss: 1.0276 - regression_loss: 0.8829 - classification_loss: 0.1447 338/500 [===================>..........] - ETA: 53s - loss: 1.0276 - regression_loss: 0.8829 - classification_loss: 0.1447 339/500 [===================>..........] - ETA: 53s - loss: 1.0283 - regression_loss: 0.8836 - classification_loss: 0.1447 340/500 [===================>..........] 
- ETA: 52s - loss: 1.0276 - regression_loss: 0.8830 - classification_loss: 0.1446 341/500 [===================>..........] - ETA: 52s - loss: 1.0274 - regression_loss: 0.8828 - classification_loss: 0.1446 342/500 [===================>..........] - ETA: 52s - loss: 1.0276 - regression_loss: 0.8829 - classification_loss: 0.1446 343/500 [===================>..........] - ETA: 51s - loss: 1.0259 - regression_loss: 0.8815 - classification_loss: 0.1443 344/500 [===================>..........] - ETA: 51s - loss: 1.0256 - regression_loss: 0.8813 - classification_loss: 0.1443 345/500 [===================>..........] - ETA: 51s - loss: 1.0245 - regression_loss: 0.8804 - classification_loss: 0.1441 346/500 [===================>..........] - ETA: 51s - loss: 1.0242 - regression_loss: 0.8801 - classification_loss: 0.1441 347/500 [===================>..........] - ETA: 50s - loss: 1.0255 - regression_loss: 0.8812 - classification_loss: 0.1443 348/500 [===================>..........] - ETA: 50s - loss: 1.0244 - regression_loss: 0.8804 - classification_loss: 0.1440 349/500 [===================>..........] - ETA: 50s - loss: 1.0253 - regression_loss: 0.8812 - classification_loss: 0.1441 350/500 [====================>.........] - ETA: 49s - loss: 1.0266 - regression_loss: 0.8822 - classification_loss: 0.1444 351/500 [====================>.........] - ETA: 49s - loss: 1.0274 - regression_loss: 0.8829 - classification_loss: 0.1445 352/500 [====================>.........] - ETA: 49s - loss: 1.0260 - regression_loss: 0.8817 - classification_loss: 0.1443 353/500 [====================>.........] - ETA: 48s - loss: 1.0266 - regression_loss: 0.8824 - classification_loss: 0.1442 354/500 [====================>.........] - ETA: 48s - loss: 1.0261 - regression_loss: 0.8820 - classification_loss: 0.1441 355/500 [====================>.........] - ETA: 48s - loss: 1.0258 - regression_loss: 0.8818 - classification_loss: 0.1440 356/500 [====================>.........] 
[... per-batch progress redraws for epoch 30 (steps 357-499) elided ...]
500/500 [==============================] - 166s 331ms/step - loss: 1.0252 - regression_loss: 0.8812 - classification_loss: 0.1439
1172 instances of class plum with average precision: 0.6359
mAP: 0.6359
Epoch 00030: saving model to ./training/snapshots/resnet101_pascal_30.h5
Epoch 31/150
[... per-batch progress redraws for epoch 31 (steps 1-188) elided; last logged step: 189/500 [==========>...................] - ETA: 1:42 - loss: 0.9771 - regression_loss: 0.8401 - classification_loss: 0.1370 ...]
- ETA: 1:42 - loss: 0.9772 - regression_loss: 0.8401 - classification_loss: 0.1371 191/500 [==========>...................] - ETA: 1:42 - loss: 0.9746 - regression_loss: 0.8378 - classification_loss: 0.1368 192/500 [==========>...................] - ETA: 1:41 - loss: 0.9747 - regression_loss: 0.8378 - classification_loss: 0.1369 193/500 [==========>...................] - ETA: 1:41 - loss: 0.9718 - regression_loss: 0.8352 - classification_loss: 0.1366 194/500 [==========>...................] - ETA: 1:41 - loss: 0.9698 - regression_loss: 0.8329 - classification_loss: 0.1369 195/500 [==========>...................] - ETA: 1:40 - loss: 0.9686 - regression_loss: 0.8318 - classification_loss: 0.1367 196/500 [==========>...................] - ETA: 1:40 - loss: 0.9664 - regression_loss: 0.8297 - classification_loss: 0.1367 197/500 [==========>...................] - ETA: 1:40 - loss: 0.9665 - regression_loss: 0.8299 - classification_loss: 0.1367 198/500 [==========>...................] - ETA: 1:39 - loss: 0.9654 - regression_loss: 0.8290 - classification_loss: 0.1364 199/500 [==========>...................] - ETA: 1:39 - loss: 0.9635 - regression_loss: 0.8275 - classification_loss: 0.1360 200/500 [===========>..................] - ETA: 1:39 - loss: 0.9636 - regression_loss: 0.8276 - classification_loss: 0.1360 201/500 [===========>..................] - ETA: 1:38 - loss: 0.9647 - regression_loss: 0.8285 - classification_loss: 0.1362 202/500 [===========>..................] - ETA: 1:38 - loss: 0.9673 - regression_loss: 0.8306 - classification_loss: 0.1367 203/500 [===========>..................] - ETA: 1:38 - loss: 0.9675 - regression_loss: 0.8308 - classification_loss: 0.1367 204/500 [===========>..................] - ETA: 1:37 - loss: 0.9675 - regression_loss: 0.8309 - classification_loss: 0.1366 205/500 [===========>..................] - ETA: 1:37 - loss: 0.9704 - regression_loss: 0.8331 - classification_loss: 0.1374 206/500 [===========>..................] 
- ETA: 1:37 - loss: 0.9731 - regression_loss: 0.8354 - classification_loss: 0.1376 207/500 [===========>..................] - ETA: 1:36 - loss: 0.9745 - regression_loss: 0.8368 - classification_loss: 0.1377 208/500 [===========>..................] - ETA: 1:36 - loss: 0.9737 - regression_loss: 0.8363 - classification_loss: 0.1375 209/500 [===========>..................] - ETA: 1:36 - loss: 0.9729 - regression_loss: 0.8356 - classification_loss: 0.1373 210/500 [===========>..................] - ETA: 1:35 - loss: 0.9746 - regression_loss: 0.8369 - classification_loss: 0.1377 211/500 [===========>..................] - ETA: 1:35 - loss: 0.9747 - regression_loss: 0.8369 - classification_loss: 0.1378 212/500 [===========>..................] - ETA: 1:35 - loss: 0.9753 - regression_loss: 0.8376 - classification_loss: 0.1377 213/500 [===========>..................] - ETA: 1:34 - loss: 0.9773 - regression_loss: 0.8394 - classification_loss: 0.1379 214/500 [===========>..................] - ETA: 1:34 - loss: 0.9765 - regression_loss: 0.8386 - classification_loss: 0.1378 215/500 [===========>..................] - ETA: 1:34 - loss: 0.9742 - regression_loss: 0.8368 - classification_loss: 0.1374 216/500 [===========>..................] - ETA: 1:33 - loss: 0.9772 - regression_loss: 0.8392 - classification_loss: 0.1380 217/500 [============>.................] - ETA: 1:33 - loss: 0.9756 - regression_loss: 0.8380 - classification_loss: 0.1376 218/500 [============>.................] - ETA: 1:33 - loss: 0.9734 - regression_loss: 0.8361 - classification_loss: 0.1373 219/500 [============>.................] - ETA: 1:32 - loss: 0.9714 - regression_loss: 0.8345 - classification_loss: 0.1369 220/500 [============>.................] - ETA: 1:32 - loss: 0.9703 - regression_loss: 0.8337 - classification_loss: 0.1366 221/500 [============>.................] - ETA: 1:32 - loss: 0.9689 - regression_loss: 0.8327 - classification_loss: 0.1362 222/500 [============>.................] 
- ETA: 1:31 - loss: 0.9709 - regression_loss: 0.8344 - classification_loss: 0.1365 223/500 [============>.................] - ETA: 1:31 - loss: 0.9719 - regression_loss: 0.8353 - classification_loss: 0.1366 224/500 [============>.................] - ETA: 1:31 - loss: 0.9726 - regression_loss: 0.8359 - classification_loss: 0.1368 225/500 [============>.................] - ETA: 1:30 - loss: 0.9701 - regression_loss: 0.8338 - classification_loss: 0.1364 226/500 [============>.................] - ETA: 1:30 - loss: 0.9707 - regression_loss: 0.8343 - classification_loss: 0.1364 227/500 [============>.................] - ETA: 1:30 - loss: 0.9700 - regression_loss: 0.8338 - classification_loss: 0.1362 228/500 [============>.................] - ETA: 1:29 - loss: 0.9685 - regression_loss: 0.8326 - classification_loss: 0.1359 229/500 [============>.................] - ETA: 1:29 - loss: 0.9669 - regression_loss: 0.8313 - classification_loss: 0.1356 230/500 [============>.................] - ETA: 1:29 - loss: 0.9670 - regression_loss: 0.8314 - classification_loss: 0.1356 231/500 [============>.................] - ETA: 1:28 - loss: 0.9681 - regression_loss: 0.8326 - classification_loss: 0.1355 232/500 [============>.................] - ETA: 1:28 - loss: 0.9680 - regression_loss: 0.8327 - classification_loss: 0.1352 233/500 [============>.................] - ETA: 1:28 - loss: 0.9676 - regression_loss: 0.8324 - classification_loss: 0.1352 234/500 [=============>................] - ETA: 1:27 - loss: 0.9694 - regression_loss: 0.8339 - classification_loss: 0.1355 235/500 [=============>................] - ETA: 1:27 - loss: 0.9695 - regression_loss: 0.8339 - classification_loss: 0.1356 236/500 [=============>................] - ETA: 1:27 - loss: 0.9719 - regression_loss: 0.8357 - classification_loss: 0.1361 237/500 [=============>................] - ETA: 1:27 - loss: 0.9710 - regression_loss: 0.8350 - classification_loss: 0.1360 238/500 [=============>................] 
- ETA: 1:26 - loss: 0.9693 - regression_loss: 0.8334 - classification_loss: 0.1359 239/500 [=============>................] - ETA: 1:26 - loss: 0.9697 - regression_loss: 0.8337 - classification_loss: 0.1359 240/500 [=============>................] - ETA: 1:25 - loss: 0.9685 - regression_loss: 0.8328 - classification_loss: 0.1358 241/500 [=============>................] - ETA: 1:25 - loss: 0.9669 - regression_loss: 0.8313 - classification_loss: 0.1356 242/500 [=============>................] - ETA: 1:25 - loss: 0.9639 - regression_loss: 0.8287 - classification_loss: 0.1352 243/500 [=============>................] - ETA: 1:24 - loss: 0.9653 - regression_loss: 0.8297 - classification_loss: 0.1357 244/500 [=============>................] - ETA: 1:24 - loss: 0.9674 - regression_loss: 0.8314 - classification_loss: 0.1360 245/500 [=============>................] - ETA: 1:24 - loss: 0.9692 - regression_loss: 0.8330 - classification_loss: 0.1362 246/500 [=============>................] - ETA: 1:23 - loss: 0.9677 - regression_loss: 0.8318 - classification_loss: 0.1359 247/500 [=============>................] - ETA: 1:23 - loss: 0.9661 - regression_loss: 0.8304 - classification_loss: 0.1357 248/500 [=============>................] - ETA: 1:23 - loss: 0.9645 - regression_loss: 0.8291 - classification_loss: 0.1354 249/500 [=============>................] - ETA: 1:22 - loss: 0.9631 - regression_loss: 0.8280 - classification_loss: 0.1351 250/500 [==============>...............] - ETA: 1:22 - loss: 0.9628 - regression_loss: 0.8279 - classification_loss: 0.1349 251/500 [==============>...............] - ETA: 1:22 - loss: 0.9604 - regression_loss: 0.8258 - classification_loss: 0.1346 252/500 [==============>...............] - ETA: 1:21 - loss: 0.9612 - regression_loss: 0.8264 - classification_loss: 0.1348 253/500 [==============>...............] - ETA: 1:21 - loss: 0.9610 - regression_loss: 0.8263 - classification_loss: 0.1347 254/500 [==============>...............] 
- ETA: 1:21 - loss: 0.9622 - regression_loss: 0.8274 - classification_loss: 0.1348 255/500 [==============>...............] - ETA: 1:20 - loss: 0.9616 - regression_loss: 0.8270 - classification_loss: 0.1346 256/500 [==============>...............] - ETA: 1:20 - loss: 0.9614 - regression_loss: 0.8269 - classification_loss: 0.1345 257/500 [==============>...............] - ETA: 1:20 - loss: 0.9592 - regression_loss: 0.8249 - classification_loss: 0.1343 258/500 [==============>...............] - ETA: 1:19 - loss: 0.9595 - regression_loss: 0.8249 - classification_loss: 0.1345 259/500 [==============>...............] - ETA: 1:19 - loss: 0.9601 - regression_loss: 0.8257 - classification_loss: 0.1345 260/500 [==============>...............] - ETA: 1:19 - loss: 0.9633 - regression_loss: 0.8283 - classification_loss: 0.1349 261/500 [==============>...............] - ETA: 1:19 - loss: 0.9629 - regression_loss: 0.8282 - classification_loss: 0.1347 262/500 [==============>...............] - ETA: 1:18 - loss: 0.9646 - regression_loss: 0.8295 - classification_loss: 0.1351 263/500 [==============>...............] - ETA: 1:18 - loss: 0.9642 - regression_loss: 0.8293 - classification_loss: 0.1349 264/500 [==============>...............] - ETA: 1:17 - loss: 0.9637 - regression_loss: 0.8288 - classification_loss: 0.1349 265/500 [==============>...............] - ETA: 1:17 - loss: 0.9649 - regression_loss: 0.8298 - classification_loss: 0.1351 266/500 [==============>...............] - ETA: 1:17 - loss: 0.9662 - regression_loss: 0.8310 - classification_loss: 0.1352 267/500 [===============>..............] - ETA: 1:17 - loss: 0.9648 - regression_loss: 0.8297 - classification_loss: 0.1350 268/500 [===============>..............] - ETA: 1:16 - loss: 0.9655 - regression_loss: 0.8303 - classification_loss: 0.1351 269/500 [===============>..............] - ETA: 1:16 - loss: 0.9656 - regression_loss: 0.8304 - classification_loss: 0.1351 270/500 [===============>..............] 
- ETA: 1:16 - loss: 0.9650 - regression_loss: 0.8301 - classification_loss: 0.1350 271/500 [===============>..............] - ETA: 1:15 - loss: 0.9635 - regression_loss: 0.8288 - classification_loss: 0.1347 272/500 [===============>..............] - ETA: 1:15 - loss: 0.9650 - regression_loss: 0.8301 - classification_loss: 0.1350 273/500 [===============>..............] - ETA: 1:15 - loss: 0.9633 - regression_loss: 0.8286 - classification_loss: 0.1347 274/500 [===============>..............] - ETA: 1:14 - loss: 0.9624 - regression_loss: 0.8280 - classification_loss: 0.1344 275/500 [===============>..............] - ETA: 1:14 - loss: 0.9608 - regression_loss: 0.8265 - classification_loss: 0.1343 276/500 [===============>..............] - ETA: 1:14 - loss: 0.9593 - regression_loss: 0.8253 - classification_loss: 0.1339 277/500 [===============>..............] - ETA: 1:13 - loss: 0.9595 - regression_loss: 0.8256 - classification_loss: 0.1339 278/500 [===============>..............] - ETA: 1:13 - loss: 0.9609 - regression_loss: 0.8269 - classification_loss: 0.1340 279/500 [===============>..............] - ETA: 1:13 - loss: 0.9594 - regression_loss: 0.8257 - classification_loss: 0.1337 280/500 [===============>..............] - ETA: 1:12 - loss: 0.9600 - regression_loss: 0.8263 - classification_loss: 0.1338 281/500 [===============>..............] - ETA: 1:12 - loss: 0.9589 - regression_loss: 0.8253 - classification_loss: 0.1336 282/500 [===============>..............] - ETA: 1:12 - loss: 0.9597 - regression_loss: 0.8260 - classification_loss: 0.1337 283/500 [===============>..............] - ETA: 1:11 - loss: 0.9614 - regression_loss: 0.8278 - classification_loss: 0.1336 284/500 [================>.............] - ETA: 1:11 - loss: 0.9616 - regression_loss: 0.8280 - classification_loss: 0.1336 285/500 [================>.............] - ETA: 1:10 - loss: 0.9601 - regression_loss: 0.8268 - classification_loss: 0.1333 286/500 [================>.............] 
- ETA: 1:10 - loss: 0.9590 - regression_loss: 0.8259 - classification_loss: 0.1331 287/500 [================>.............] - ETA: 1:10 - loss: 0.9608 - regression_loss: 0.8274 - classification_loss: 0.1334 288/500 [================>.............] - ETA: 1:09 - loss: 0.9592 - regression_loss: 0.8259 - classification_loss: 0.1333 289/500 [================>.............] - ETA: 1:09 - loss: 0.9592 - regression_loss: 0.8259 - classification_loss: 0.1333 290/500 [================>.............] - ETA: 1:09 - loss: 0.9591 - regression_loss: 0.8258 - classification_loss: 0.1333 291/500 [================>.............] - ETA: 1:09 - loss: 0.9614 - regression_loss: 0.8278 - classification_loss: 0.1336 292/500 [================>.............] - ETA: 1:08 - loss: 0.9596 - regression_loss: 0.8263 - classification_loss: 0.1333 293/500 [================>.............] - ETA: 1:08 - loss: 0.9617 - regression_loss: 0.8283 - classification_loss: 0.1334 294/500 [================>.............] - ETA: 1:08 - loss: 0.9638 - regression_loss: 0.8300 - classification_loss: 0.1338 295/500 [================>.............] - ETA: 1:07 - loss: 0.9652 - regression_loss: 0.8312 - classification_loss: 0.1340 296/500 [================>.............] - ETA: 1:07 - loss: 0.9645 - regression_loss: 0.8305 - classification_loss: 0.1341 297/500 [================>.............] - ETA: 1:07 - loss: 0.9650 - regression_loss: 0.8309 - classification_loss: 0.1341 298/500 [================>.............] - ETA: 1:06 - loss: 0.9668 - regression_loss: 0.8324 - classification_loss: 0.1345 299/500 [================>.............] - ETA: 1:06 - loss: 0.9645 - regression_loss: 0.8303 - classification_loss: 0.1342 300/500 [=================>............] - ETA: 1:06 - loss: 0.9632 - regression_loss: 0.8293 - classification_loss: 0.1340 301/500 [=================>............] - ETA: 1:05 - loss: 0.9614 - regression_loss: 0.8276 - classification_loss: 0.1337 302/500 [=================>............] 
- ETA: 1:05 - loss: 0.9614 - regression_loss: 0.8276 - classification_loss: 0.1338 303/500 [=================>............] - ETA: 1:05 - loss: 0.9633 - regression_loss: 0.8295 - classification_loss: 0.1338 304/500 [=================>............] - ETA: 1:04 - loss: 0.9639 - regression_loss: 0.8299 - classification_loss: 0.1340 305/500 [=================>............] - ETA: 1:04 - loss: 0.9639 - regression_loss: 0.8299 - classification_loss: 0.1339 306/500 [=================>............] - ETA: 1:04 - loss: 0.9648 - regression_loss: 0.8310 - classification_loss: 0.1338 307/500 [=================>............] - ETA: 1:03 - loss: 0.9637 - regression_loss: 0.8300 - classification_loss: 0.1337 308/500 [=================>............] - ETA: 1:03 - loss: 0.9622 - regression_loss: 0.8288 - classification_loss: 0.1335 309/500 [=================>............] - ETA: 1:03 - loss: 0.9626 - regression_loss: 0.8292 - classification_loss: 0.1334 310/500 [=================>............] - ETA: 1:02 - loss: 0.9643 - regression_loss: 0.8305 - classification_loss: 0.1338 311/500 [=================>............] - ETA: 1:02 - loss: 0.9650 - regression_loss: 0.8313 - classification_loss: 0.1337 312/500 [=================>............] - ETA: 1:02 - loss: 0.9647 - regression_loss: 0.8311 - classification_loss: 0.1336 313/500 [=================>............] - ETA: 1:01 - loss: 0.9650 - regression_loss: 0.8313 - classification_loss: 0.1337 314/500 [=================>............] - ETA: 1:01 - loss: 0.9660 - regression_loss: 0.8321 - classification_loss: 0.1339 315/500 [=================>............] - ETA: 1:01 - loss: 0.9651 - regression_loss: 0.8313 - classification_loss: 0.1338 316/500 [=================>............] - ETA: 1:00 - loss: 0.9668 - regression_loss: 0.8329 - classification_loss: 0.1340 317/500 [==================>...........] - ETA: 1:00 - loss: 0.9667 - regression_loss: 0.8328 - classification_loss: 0.1339 318/500 [==================>...........] 
- ETA: 1:00 - loss: 0.9658 - regression_loss: 0.8317 - classification_loss: 0.1341 319/500 [==================>...........] - ETA: 59s - loss: 0.9659 - regression_loss: 0.8318 - classification_loss: 0.1342  320/500 [==================>...........] - ETA: 59s - loss: 0.9662 - regression_loss: 0.8321 - classification_loss: 0.1342 321/500 [==================>...........] - ETA: 59s - loss: 0.9682 - regression_loss: 0.8337 - classification_loss: 0.1345 322/500 [==================>...........] - ETA: 58s - loss: 0.9680 - regression_loss: 0.8335 - classification_loss: 0.1344 323/500 [==================>...........] - ETA: 58s - loss: 0.9689 - regression_loss: 0.8343 - classification_loss: 0.1346 324/500 [==================>...........] - ETA: 58s - loss: 0.9674 - regression_loss: 0.8331 - classification_loss: 0.1343 325/500 [==================>...........] - ETA: 57s - loss: 0.9688 - regression_loss: 0.8343 - classification_loss: 0.1345 326/500 [==================>...........] - ETA: 57s - loss: 0.9694 - regression_loss: 0.8348 - classification_loss: 0.1347 327/500 [==================>...........] - ETA: 57s - loss: 0.9700 - regression_loss: 0.8353 - classification_loss: 0.1347 328/500 [==================>...........] - ETA: 56s - loss: 0.9687 - regression_loss: 0.8341 - classification_loss: 0.1346 329/500 [==================>...........] - ETA: 56s - loss: 0.9675 - regression_loss: 0.8332 - classification_loss: 0.1343 330/500 [==================>...........] - ETA: 56s - loss: 0.9665 - regression_loss: 0.8323 - classification_loss: 0.1341 331/500 [==================>...........] - ETA: 55s - loss: 0.9652 - regression_loss: 0.8312 - classification_loss: 0.1340 332/500 [==================>...........] - ETA: 55s - loss: 0.9652 - regression_loss: 0.8311 - classification_loss: 0.1341 333/500 [==================>...........] - ETA: 55s - loss: 0.9656 - regression_loss: 0.8314 - classification_loss: 0.1341 334/500 [===================>..........] 
- ETA: 54s - loss: 0.9660 - regression_loss: 0.8319 - classification_loss: 0.1341 335/500 [===================>..........] - ETA: 54s - loss: 0.9663 - regression_loss: 0.8321 - classification_loss: 0.1342 336/500 [===================>..........] - ETA: 54s - loss: 0.9667 - regression_loss: 0.8325 - classification_loss: 0.1342 337/500 [===================>..........] - ETA: 53s - loss: 0.9678 - regression_loss: 0.8334 - classification_loss: 0.1343 338/500 [===================>..........] - ETA: 53s - loss: 0.9681 - regression_loss: 0.8338 - classification_loss: 0.1344 339/500 [===================>..........] - ETA: 53s - loss: 0.9686 - regression_loss: 0.8340 - classification_loss: 0.1346 340/500 [===================>..........] - ETA: 52s - loss: 0.9687 - regression_loss: 0.8341 - classification_loss: 0.1346 341/500 [===================>..........] - ETA: 52s - loss: 0.9686 - regression_loss: 0.8340 - classification_loss: 0.1346 342/500 [===================>..........] - ETA: 52s - loss: 0.9691 - regression_loss: 0.8344 - classification_loss: 0.1346 343/500 [===================>..........] - ETA: 51s - loss: 0.9718 - regression_loss: 0.8365 - classification_loss: 0.1352 344/500 [===================>..........] - ETA: 51s - loss: 0.9713 - regression_loss: 0.8361 - classification_loss: 0.1352 345/500 [===================>..........] - ETA: 51s - loss: 0.9699 - regression_loss: 0.8349 - classification_loss: 0.1350 346/500 [===================>..........] - ETA: 50s - loss: 0.9694 - regression_loss: 0.8346 - classification_loss: 0.1349 347/500 [===================>..........] - ETA: 50s - loss: 0.9696 - regression_loss: 0.8347 - classification_loss: 0.1349 348/500 [===================>..........] - ETA: 50s - loss: 0.9684 - regression_loss: 0.8337 - classification_loss: 0.1347 349/500 [===================>..........] - ETA: 49s - loss: 0.9704 - regression_loss: 0.8354 - classification_loss: 0.1351 350/500 [====================>.........] 
- ETA: 49s - loss: 0.9718 - regression_loss: 0.8365 - classification_loss: 0.1353 351/500 [====================>.........] - ETA: 49s - loss: 0.9732 - regression_loss: 0.8378 - classification_loss: 0.1354 352/500 [====================>.........] - ETA: 48s - loss: 0.9728 - regression_loss: 0.8375 - classification_loss: 0.1353 353/500 [====================>.........] - ETA: 48s - loss: 0.9714 - regression_loss: 0.8364 - classification_loss: 0.1350 354/500 [====================>.........] - ETA: 48s - loss: 0.9732 - regression_loss: 0.8379 - classification_loss: 0.1353 355/500 [====================>.........] - ETA: 47s - loss: 0.9730 - regression_loss: 0.8378 - classification_loss: 0.1353 356/500 [====================>.........] - ETA: 47s - loss: 0.9714 - regression_loss: 0.8364 - classification_loss: 0.1350 357/500 [====================>.........] - ETA: 47s - loss: 0.9719 - regression_loss: 0.8368 - classification_loss: 0.1350 358/500 [====================>.........] - ETA: 46s - loss: 0.9711 - regression_loss: 0.8363 - classification_loss: 0.1349 359/500 [====================>.........] - ETA: 46s - loss: 0.9725 - regression_loss: 0.8375 - classification_loss: 0.1351 360/500 [====================>.........] - ETA: 46s - loss: 0.9736 - regression_loss: 0.8384 - classification_loss: 0.1352 361/500 [====================>.........] - ETA: 45s - loss: 0.9726 - regression_loss: 0.8375 - classification_loss: 0.1351 362/500 [====================>.........] - ETA: 45s - loss: 0.9734 - regression_loss: 0.8380 - classification_loss: 0.1353 363/500 [====================>.........] - ETA: 45s - loss: 0.9723 - regression_loss: 0.8371 - classification_loss: 0.1352 364/500 [====================>.........] - ETA: 44s - loss: 0.9733 - regression_loss: 0.8380 - classification_loss: 0.1353 365/500 [====================>.........] - ETA: 44s - loss: 0.9749 - regression_loss: 0.8394 - classification_loss: 0.1355 366/500 [====================>.........] 
- ETA: 44s - loss: 0.9755 - regression_loss: 0.8398 - classification_loss: 0.1356 367/500 [=====================>........] - ETA: 43s - loss: 0.9753 - regression_loss: 0.8398 - classification_loss: 0.1355 368/500 [=====================>........] - ETA: 43s - loss: 0.9765 - regression_loss: 0.8407 - classification_loss: 0.1358 369/500 [=====================>........] - ETA: 43s - loss: 0.9749 - regression_loss: 0.8394 - classification_loss: 0.1355 370/500 [=====================>........] - ETA: 42s - loss: 0.9756 - regression_loss: 0.8402 - classification_loss: 0.1354 371/500 [=====================>........] - ETA: 42s - loss: 0.9743 - regression_loss: 0.8390 - classification_loss: 0.1353 372/500 [=====================>........] - ETA: 42s - loss: 0.9723 - regression_loss: 0.8373 - classification_loss: 0.1351 373/500 [=====================>........] - ETA: 41s - loss: 0.9706 - regression_loss: 0.8358 - classification_loss: 0.1348 374/500 [=====================>........] - ETA: 41s - loss: 0.9725 - regression_loss: 0.8374 - classification_loss: 0.1351 375/500 [=====================>........] - ETA: 41s - loss: 0.9715 - regression_loss: 0.8365 - classification_loss: 0.1350 376/500 [=====================>........] - ETA: 40s - loss: 0.9711 - regression_loss: 0.8362 - classification_loss: 0.1348 377/500 [=====================>........] - ETA: 40s - loss: 0.9728 - regression_loss: 0.8377 - classification_loss: 0.1351 378/500 [=====================>........] - ETA: 40s - loss: 0.9726 - regression_loss: 0.8374 - classification_loss: 0.1351 379/500 [=====================>........] - ETA: 39s - loss: 0.9733 - regression_loss: 0.8381 - classification_loss: 0.1353 380/500 [=====================>........] - ETA: 39s - loss: 0.9731 - regression_loss: 0.8379 - classification_loss: 0.1352 381/500 [=====================>........] - ETA: 39s - loss: 0.9718 - regression_loss: 0.8369 - classification_loss: 0.1349 382/500 [=====================>........] 
- ETA: 38s - loss: 0.9717 - regression_loss: 0.8369 - classification_loss: 0.1348 383/500 [=====================>........] - ETA: 38s - loss: 0.9726 - regression_loss: 0.8377 - classification_loss: 0.1349 384/500 [======================>.......] - ETA: 38s - loss: 0.9721 - regression_loss: 0.8374 - classification_loss: 0.1347 385/500 [======================>.......] - ETA: 37s - loss: 0.9714 - regression_loss: 0.8367 - classification_loss: 0.1346 386/500 [======================>.......] - ETA: 37s - loss: 0.9723 - regression_loss: 0.8375 - classification_loss: 0.1348 387/500 [======================>.......] - ETA: 37s - loss: 0.9718 - regression_loss: 0.8370 - classification_loss: 0.1348 388/500 [======================>.......] - ETA: 36s - loss: 0.9729 - regression_loss: 0.8379 - classification_loss: 0.1350 389/500 [======================>.......] - ETA: 36s - loss: 0.9735 - regression_loss: 0.8384 - classification_loss: 0.1350 390/500 [======================>.......] - ETA: 36s - loss: 0.9721 - regression_loss: 0.8373 - classification_loss: 0.1349 391/500 [======================>.......] - ETA: 35s - loss: 0.9717 - regression_loss: 0.8369 - classification_loss: 0.1348 392/500 [======================>.......] - ETA: 35s - loss: 0.9728 - regression_loss: 0.8379 - classification_loss: 0.1348 393/500 [======================>.......] - ETA: 35s - loss: 0.9719 - regression_loss: 0.8372 - classification_loss: 0.1347 394/500 [======================>.......] - ETA: 35s - loss: 0.9727 - regression_loss: 0.8381 - classification_loss: 0.1346 395/500 [======================>.......] - ETA: 34s - loss: 0.9736 - regression_loss: 0.8388 - classification_loss: 0.1348 396/500 [======================>.......] - ETA: 34s - loss: 0.9743 - regression_loss: 0.8394 - classification_loss: 0.1349 397/500 [======================>.......] - ETA: 34s - loss: 0.9755 - regression_loss: 0.8403 - classification_loss: 0.1352 398/500 [======================>.......] 
500/500 [==============================] - 165s 331ms/step - loss: 0.9741 - regression_loss: 0.8398 - classification_loss: 0.1343 
1172 instances of class plum with average precision: 0.6154
mAP: 0.6154
Epoch 00031: saving model to ./training/snapshots/resnet101_pascal_31.h5
Epoch 32/150
  1/500 [..............................] - ETA: 2:31 - loss: 1.1474 - regression_loss: 1.0028 - classification_loss: 0.1446
- ETA: 1:28 - loss: 0.9768 - regression_loss: 0.8407 - classification_loss: 0.1361 234/500 [=============>................] - ETA: 1:28 - loss: 0.9772 - regression_loss: 0.8410 - classification_loss: 0.1362 235/500 [=============>................] - ETA: 1:27 - loss: 0.9794 - regression_loss: 0.8428 - classification_loss: 0.1366 236/500 [=============>................] - ETA: 1:27 - loss: 0.9795 - regression_loss: 0.8429 - classification_loss: 0.1366 237/500 [=============>................] - ETA: 1:27 - loss: 0.9804 - regression_loss: 0.8437 - classification_loss: 0.1367 238/500 [=============>................] - ETA: 1:26 - loss: 0.9824 - regression_loss: 0.8452 - classification_loss: 0.1372 239/500 [=============>................] - ETA: 1:26 - loss: 0.9818 - regression_loss: 0.8447 - classification_loss: 0.1371 240/500 [=============>................] - ETA: 1:26 - loss: 0.9821 - regression_loss: 0.8449 - classification_loss: 0.1372 241/500 [=============>................] - ETA: 1:25 - loss: 0.9820 - regression_loss: 0.8449 - classification_loss: 0.1371 242/500 [=============>................] - ETA: 1:25 - loss: 0.9818 - regression_loss: 0.8446 - classification_loss: 0.1372 243/500 [=============>................] - ETA: 1:25 - loss: 0.9818 - regression_loss: 0.8447 - classification_loss: 0.1371 244/500 [=============>................] - ETA: 1:24 - loss: 0.9837 - regression_loss: 0.8464 - classification_loss: 0.1374 245/500 [=============>................] - ETA: 1:24 - loss: 0.9848 - regression_loss: 0.8472 - classification_loss: 0.1375 246/500 [=============>................] - ETA: 1:24 - loss: 0.9831 - regression_loss: 0.8457 - classification_loss: 0.1374 247/500 [=============>................] - ETA: 1:23 - loss: 0.9840 - regression_loss: 0.8466 - classification_loss: 0.1374 248/500 [=============>................] - ETA: 1:23 - loss: 0.9852 - regression_loss: 0.8477 - classification_loss: 0.1375 249/500 [=============>................] 
- ETA: 1:23 - loss: 0.9876 - regression_loss: 0.8497 - classification_loss: 0.1379 250/500 [==============>...............] - ETA: 1:22 - loss: 0.9869 - regression_loss: 0.8492 - classification_loss: 0.1377 251/500 [==============>...............] - ETA: 1:22 - loss: 0.9883 - regression_loss: 0.8502 - classification_loss: 0.1380 252/500 [==============>...............] - ETA: 1:22 - loss: 0.9862 - regression_loss: 0.8485 - classification_loss: 0.1377 253/500 [==============>...............] - ETA: 1:21 - loss: 0.9831 - regression_loss: 0.8458 - classification_loss: 0.1373 254/500 [==============>...............] - ETA: 1:21 - loss: 0.9813 - regression_loss: 0.8443 - classification_loss: 0.1370 255/500 [==============>...............] - ETA: 1:21 - loss: 0.9831 - regression_loss: 0.8458 - classification_loss: 0.1373 256/500 [==============>...............] - ETA: 1:20 - loss: 0.9812 - regression_loss: 0.8443 - classification_loss: 0.1369 257/500 [==============>...............] - ETA: 1:20 - loss: 0.9816 - regression_loss: 0.8448 - classification_loss: 0.1369 258/500 [==============>...............] - ETA: 1:20 - loss: 0.9805 - regression_loss: 0.8439 - classification_loss: 0.1366 259/500 [==============>...............] - ETA: 1:19 - loss: 0.9817 - regression_loss: 0.8449 - classification_loss: 0.1368 260/500 [==============>...............] - ETA: 1:19 - loss: 0.9796 - regression_loss: 0.8430 - classification_loss: 0.1366 261/500 [==============>...............] - ETA: 1:19 - loss: 0.9791 - regression_loss: 0.8424 - classification_loss: 0.1367 262/500 [==============>...............] - ETA: 1:18 - loss: 0.9796 - regression_loss: 0.8429 - classification_loss: 0.1366 263/500 [==============>...............] - ETA: 1:18 - loss: 0.9796 - regression_loss: 0.8429 - classification_loss: 0.1367 264/500 [==============>...............] - ETA: 1:18 - loss: 0.9796 - regression_loss: 0.8431 - classification_loss: 0.1365 265/500 [==============>...............] 
- ETA: 1:17 - loss: 0.9781 - regression_loss: 0.8415 - classification_loss: 0.1366 266/500 [==============>...............] - ETA: 1:17 - loss: 0.9766 - regression_loss: 0.8402 - classification_loss: 0.1364 267/500 [===============>..............] - ETA: 1:17 - loss: 0.9744 - regression_loss: 0.8382 - classification_loss: 0.1362 268/500 [===============>..............] - ETA: 1:16 - loss: 0.9742 - regression_loss: 0.8382 - classification_loss: 0.1361 269/500 [===============>..............] - ETA: 1:16 - loss: 0.9756 - regression_loss: 0.8393 - classification_loss: 0.1362 270/500 [===============>..............] - ETA: 1:16 - loss: 0.9768 - regression_loss: 0.8404 - classification_loss: 0.1364 271/500 [===============>..............] - ETA: 1:15 - loss: 0.9787 - regression_loss: 0.8419 - classification_loss: 0.1368 272/500 [===============>..............] - ETA: 1:15 - loss: 0.9790 - regression_loss: 0.8422 - classification_loss: 0.1368 273/500 [===============>..............] - ETA: 1:15 - loss: 0.9792 - regression_loss: 0.8424 - classification_loss: 0.1368 274/500 [===============>..............] - ETA: 1:14 - loss: 0.9769 - regression_loss: 0.8404 - classification_loss: 0.1365 275/500 [===============>..............] - ETA: 1:14 - loss: 0.9786 - regression_loss: 0.8420 - classification_loss: 0.1366 276/500 [===============>..............] - ETA: 1:14 - loss: 0.9814 - regression_loss: 0.8443 - classification_loss: 0.1371 277/500 [===============>..............] - ETA: 1:13 - loss: 0.9813 - regression_loss: 0.8442 - classification_loss: 0.1372 278/500 [===============>..............] - ETA: 1:13 - loss: 0.9803 - regression_loss: 0.8432 - classification_loss: 0.1371 279/500 [===============>..............] - ETA: 1:13 - loss: 0.9807 - regression_loss: 0.8436 - classification_loss: 0.1370 280/500 [===============>..............] - ETA: 1:12 - loss: 0.9818 - regression_loss: 0.8446 - classification_loss: 0.1372 281/500 [===============>..............] 
- ETA: 1:12 - loss: 0.9818 - regression_loss: 0.8447 - classification_loss: 0.1371 282/500 [===============>..............] - ETA: 1:12 - loss: 0.9816 - regression_loss: 0.8444 - classification_loss: 0.1371 283/500 [===============>..............] - ETA: 1:12 - loss: 0.9798 - regression_loss: 0.8428 - classification_loss: 0.1370 284/500 [================>.............] - ETA: 1:11 - loss: 0.9798 - regression_loss: 0.8431 - classification_loss: 0.1367 285/500 [================>.............] - ETA: 1:11 - loss: 0.9800 - regression_loss: 0.8433 - classification_loss: 0.1367 286/500 [================>.............] - ETA: 1:11 - loss: 0.9790 - regression_loss: 0.8425 - classification_loss: 0.1365 287/500 [================>.............] - ETA: 1:10 - loss: 0.9807 - regression_loss: 0.8440 - classification_loss: 0.1368 288/500 [================>.............] - ETA: 1:10 - loss: 0.9812 - regression_loss: 0.8443 - classification_loss: 0.1368 289/500 [================>.............] - ETA: 1:10 - loss: 0.9803 - regression_loss: 0.8438 - classification_loss: 0.1366 290/500 [================>.............] - ETA: 1:09 - loss: 0.9798 - regression_loss: 0.8431 - classification_loss: 0.1367 291/500 [================>.............] - ETA: 1:09 - loss: 0.9802 - regression_loss: 0.8438 - classification_loss: 0.1365 292/500 [================>.............] - ETA: 1:09 - loss: 0.9789 - regression_loss: 0.8427 - classification_loss: 0.1363 293/500 [================>.............] - ETA: 1:08 - loss: 0.9781 - regression_loss: 0.8420 - classification_loss: 0.1361 294/500 [================>.............] - ETA: 1:08 - loss: 0.9784 - regression_loss: 0.8423 - classification_loss: 0.1361 295/500 [================>.............] - ETA: 1:08 - loss: 0.9768 - regression_loss: 0.8410 - classification_loss: 0.1359 296/500 [================>.............] - ETA: 1:07 - loss: 0.9781 - regression_loss: 0.8423 - classification_loss: 0.1359 297/500 [================>.............] 
- ETA: 1:07 - loss: 0.9778 - regression_loss: 0.8420 - classification_loss: 0.1358 298/500 [================>.............] - ETA: 1:07 - loss: 0.9782 - regression_loss: 0.8424 - classification_loss: 0.1358 299/500 [================>.............] - ETA: 1:06 - loss: 0.9789 - regression_loss: 0.8431 - classification_loss: 0.1358 300/500 [=================>............] - ETA: 1:06 - loss: 0.9784 - regression_loss: 0.8426 - classification_loss: 0.1358 301/500 [=================>............] - ETA: 1:06 - loss: 0.9791 - regression_loss: 0.8432 - classification_loss: 0.1359 302/500 [=================>............] - ETA: 1:05 - loss: 0.9787 - regression_loss: 0.8429 - classification_loss: 0.1358 303/500 [=================>............] - ETA: 1:05 - loss: 0.9783 - regression_loss: 0.8426 - classification_loss: 0.1357 304/500 [=================>............] - ETA: 1:05 - loss: 0.9783 - regression_loss: 0.8427 - classification_loss: 0.1357 305/500 [=================>............] - ETA: 1:04 - loss: 0.9803 - regression_loss: 0.8442 - classification_loss: 0.1360 306/500 [=================>............] - ETA: 1:04 - loss: 0.9784 - regression_loss: 0.8426 - classification_loss: 0.1357 307/500 [=================>............] - ETA: 1:04 - loss: 0.9803 - regression_loss: 0.8445 - classification_loss: 0.1357 308/500 [=================>............] - ETA: 1:03 - loss: 0.9815 - regression_loss: 0.8459 - classification_loss: 0.1356 309/500 [=================>............] - ETA: 1:03 - loss: 0.9805 - regression_loss: 0.8451 - classification_loss: 0.1354 310/500 [=================>............] - ETA: 1:03 - loss: 0.9795 - regression_loss: 0.8443 - classification_loss: 0.1351 311/500 [=================>............] - ETA: 1:02 - loss: 0.9777 - regression_loss: 0.8429 - classification_loss: 0.1348 312/500 [=================>............] - ETA: 1:02 - loss: 0.9773 - regression_loss: 0.8426 - classification_loss: 0.1347 313/500 [=================>............] 
- ETA: 1:02 - loss: 0.9757 - regression_loss: 0.8412 - classification_loss: 0.1345 314/500 [=================>............] - ETA: 1:01 - loss: 0.9753 - regression_loss: 0.8409 - classification_loss: 0.1344 315/500 [=================>............] - ETA: 1:01 - loss: 0.9762 - regression_loss: 0.8416 - classification_loss: 0.1346 316/500 [=================>............] - ETA: 1:01 - loss: 0.9748 - regression_loss: 0.8404 - classification_loss: 0.1344 317/500 [==================>...........] - ETA: 1:00 - loss: 0.9738 - regression_loss: 0.8395 - classification_loss: 0.1342 318/500 [==================>...........] - ETA: 1:00 - loss: 0.9739 - regression_loss: 0.8397 - classification_loss: 0.1342 319/500 [==================>...........] - ETA: 1:00 - loss: 0.9743 - regression_loss: 0.8402 - classification_loss: 0.1341 320/500 [==================>...........] - ETA: 59s - loss: 0.9746 - regression_loss: 0.8406 - classification_loss: 0.1340  321/500 [==================>...........] - ETA: 59s - loss: 0.9738 - regression_loss: 0.8400 - classification_loss: 0.1339 322/500 [==================>...........] - ETA: 58s - loss: 0.9743 - regression_loss: 0.8403 - classification_loss: 0.1340 323/500 [==================>...........] - ETA: 58s - loss: 0.9748 - regression_loss: 0.8406 - classification_loss: 0.1341 324/500 [==================>...........] - ETA: 58s - loss: 0.9745 - regression_loss: 0.8405 - classification_loss: 0.1340 325/500 [==================>...........] - ETA: 57s - loss: 0.9733 - regression_loss: 0.8396 - classification_loss: 0.1338 326/500 [==================>...........] - ETA: 57s - loss: 0.9731 - regression_loss: 0.8393 - classification_loss: 0.1338 327/500 [==================>...........] - ETA: 57s - loss: 0.9737 - regression_loss: 0.8399 - classification_loss: 0.1338 328/500 [==================>...........] - ETA: 56s - loss: 0.9737 - regression_loss: 0.8399 - classification_loss: 0.1338 329/500 [==================>...........] 
- ETA: 56s - loss: 0.9733 - regression_loss: 0.8398 - classification_loss: 0.1336 330/500 [==================>...........] - ETA: 56s - loss: 0.9740 - regression_loss: 0.8403 - classification_loss: 0.1337 331/500 [==================>...........] - ETA: 55s - loss: 0.9727 - regression_loss: 0.8391 - classification_loss: 0.1336 332/500 [==================>...........] - ETA: 55s - loss: 0.9739 - regression_loss: 0.8400 - classification_loss: 0.1338 333/500 [==================>...........] - ETA: 55s - loss: 0.9724 - regression_loss: 0.8388 - classification_loss: 0.1336 334/500 [===================>..........] - ETA: 54s - loss: 0.9710 - regression_loss: 0.8377 - classification_loss: 0.1333 335/500 [===================>..........] - ETA: 54s - loss: 0.9728 - regression_loss: 0.8391 - classification_loss: 0.1337 336/500 [===================>..........] - ETA: 54s - loss: 0.9717 - regression_loss: 0.8382 - classification_loss: 0.1335 337/500 [===================>..........] - ETA: 54s - loss: 0.9706 - regression_loss: 0.8372 - classification_loss: 0.1334 338/500 [===================>..........] - ETA: 53s - loss: 0.9717 - regression_loss: 0.8384 - classification_loss: 0.1334 339/500 [===================>..........] - ETA: 53s - loss: 0.9707 - regression_loss: 0.8374 - classification_loss: 0.1332 340/500 [===================>..........] - ETA: 53s - loss: 0.9693 - regression_loss: 0.8363 - classification_loss: 0.1330 341/500 [===================>..........] - ETA: 52s - loss: 0.9695 - regression_loss: 0.8365 - classification_loss: 0.1331 342/500 [===================>..........] - ETA: 52s - loss: 0.9708 - regression_loss: 0.8375 - classification_loss: 0.1333 343/500 [===================>..........] - ETA: 52s - loss: 0.9704 - regression_loss: 0.8372 - classification_loss: 0.1332 344/500 [===================>..........] - ETA: 51s - loss: 0.9705 - regression_loss: 0.8371 - classification_loss: 0.1334 345/500 [===================>..........] 
- ETA: 51s - loss: 0.9702 - regression_loss: 0.8366 - classification_loss: 0.1335 346/500 [===================>..........] - ETA: 51s - loss: 0.9703 - regression_loss: 0.8367 - classification_loss: 0.1335 347/500 [===================>..........] - ETA: 50s - loss: 0.9698 - regression_loss: 0.8362 - classification_loss: 0.1336 348/500 [===================>..........] - ETA: 50s - loss: 0.9695 - regression_loss: 0.8360 - classification_loss: 0.1335 349/500 [===================>..........] - ETA: 50s - loss: 0.9697 - regression_loss: 0.8363 - classification_loss: 0.1335 350/500 [====================>.........] - ETA: 49s - loss: 0.9706 - regression_loss: 0.8371 - classification_loss: 0.1336 351/500 [====================>.........] - ETA: 49s - loss: 0.9724 - regression_loss: 0.8385 - classification_loss: 0.1338 352/500 [====================>.........] - ETA: 49s - loss: 0.9712 - regression_loss: 0.8375 - classification_loss: 0.1337 353/500 [====================>.........] - ETA: 48s - loss: 0.9719 - regression_loss: 0.8380 - classification_loss: 0.1339 354/500 [====================>.........] - ETA: 48s - loss: 0.9723 - regression_loss: 0.8384 - classification_loss: 0.1339 355/500 [====================>.........] - ETA: 48s - loss: 0.9719 - regression_loss: 0.8381 - classification_loss: 0.1339 356/500 [====================>.........] - ETA: 47s - loss: 0.9721 - regression_loss: 0.8383 - classification_loss: 0.1339 357/500 [====================>.........] - ETA: 47s - loss: 0.9706 - regression_loss: 0.8369 - classification_loss: 0.1337 358/500 [====================>.........] - ETA: 47s - loss: 0.9700 - regression_loss: 0.8365 - classification_loss: 0.1335 359/500 [====================>.........] - ETA: 46s - loss: 0.9717 - regression_loss: 0.8379 - classification_loss: 0.1338 360/500 [====================>.........] - ETA: 46s - loss: 0.9728 - regression_loss: 0.8389 - classification_loss: 0.1339 361/500 [====================>.........] 
- ETA: 46s - loss: 0.9729 - regression_loss: 0.8389 - classification_loss: 0.1340 362/500 [====================>.........] - ETA: 45s - loss: 0.9734 - regression_loss: 0.8394 - classification_loss: 0.1340 363/500 [====================>.........] - ETA: 45s - loss: 0.9738 - regression_loss: 0.8400 - classification_loss: 0.1338 364/500 [====================>.........] - ETA: 45s - loss: 0.9740 - regression_loss: 0.8402 - classification_loss: 0.1338 365/500 [====================>.........] - ETA: 44s - loss: 0.9737 - regression_loss: 0.8401 - classification_loss: 0.1336 366/500 [====================>.........] - ETA: 44s - loss: 0.9737 - regression_loss: 0.8400 - classification_loss: 0.1337 367/500 [=====================>........] - ETA: 44s - loss: 0.9747 - regression_loss: 0.8410 - classification_loss: 0.1337 368/500 [=====================>........] - ETA: 43s - loss: 0.9736 - regression_loss: 0.8400 - classification_loss: 0.1335 369/500 [=====================>........] - ETA: 43s - loss: 0.9731 - regression_loss: 0.8397 - classification_loss: 0.1334 370/500 [=====================>........] - ETA: 43s - loss: 0.9715 - regression_loss: 0.8384 - classification_loss: 0.1332 371/500 [=====================>........] - ETA: 42s - loss: 0.9704 - regression_loss: 0.8374 - classification_loss: 0.1329 372/500 [=====================>........] - ETA: 42s - loss: 0.9697 - regression_loss: 0.8369 - classification_loss: 0.1328 373/500 [=====================>........] - ETA: 42s - loss: 0.9697 - regression_loss: 0.8369 - classification_loss: 0.1329 374/500 [=====================>........] - ETA: 41s - loss: 0.9695 - regression_loss: 0.8366 - classification_loss: 0.1329 375/500 [=====================>........] - ETA: 41s - loss: 0.9678 - regression_loss: 0.8350 - classification_loss: 0.1327 376/500 [=====================>........] - ETA: 41s - loss: 0.9668 - regression_loss: 0.8343 - classification_loss: 0.1325 377/500 [=====================>........] 
- ETA: 40s - loss: 0.9656 - regression_loss: 0.8333 - classification_loss: 0.1324 378/500 [=====================>........] - ETA: 40s - loss: 0.9647 - regression_loss: 0.8325 - classification_loss: 0.1322 379/500 [=====================>........] - ETA: 40s - loss: 0.9654 - regression_loss: 0.8330 - classification_loss: 0.1324 380/500 [=====================>........] - ETA: 39s - loss: 0.9641 - regression_loss: 0.8319 - classification_loss: 0.1322 381/500 [=====================>........] - ETA: 39s - loss: 0.9625 - regression_loss: 0.8306 - classification_loss: 0.1320 382/500 [=====================>........] - ETA: 39s - loss: 0.9637 - regression_loss: 0.8314 - classification_loss: 0.1323 383/500 [=====================>........] - ETA: 38s - loss: 0.9644 - regression_loss: 0.8319 - classification_loss: 0.1324 384/500 [======================>.......] - ETA: 38s - loss: 0.9633 - regression_loss: 0.8310 - classification_loss: 0.1323 385/500 [======================>.......] - ETA: 38s - loss: 0.9628 - regression_loss: 0.8305 - classification_loss: 0.1323 386/500 [======================>.......] - ETA: 37s - loss: 0.9636 - regression_loss: 0.8311 - classification_loss: 0.1325 387/500 [======================>.......] - ETA: 37s - loss: 0.9640 - regression_loss: 0.8314 - classification_loss: 0.1326 388/500 [======================>.......] - ETA: 37s - loss: 0.9655 - regression_loss: 0.8326 - classification_loss: 0.1329 389/500 [======================>.......] - ETA: 36s - loss: 0.9657 - regression_loss: 0.8328 - classification_loss: 0.1329 390/500 [======================>.......] - ETA: 36s - loss: 0.9639 - regression_loss: 0.8312 - classification_loss: 0.1326 391/500 [======================>.......] - ETA: 36s - loss: 0.9639 - regression_loss: 0.8313 - classification_loss: 0.1327 392/500 [======================>.......] - ETA: 35s - loss: 0.9636 - regression_loss: 0.8310 - classification_loss: 0.1326 393/500 [======================>.......] 
- ETA: 35s - loss: 0.9632 - regression_loss: 0.8307 - classification_loss: 0.1325 394/500 [======================>.......] - ETA: 35s - loss: 0.9647 - regression_loss: 0.8321 - classification_loss: 0.1327 395/500 [======================>.......] - ETA: 34s - loss: 0.9650 - regression_loss: 0.8323 - classification_loss: 0.1327 396/500 [======================>.......] - ETA: 34s - loss: 0.9649 - regression_loss: 0.8321 - classification_loss: 0.1328 397/500 [======================>.......] - ETA: 34s - loss: 0.9662 - regression_loss: 0.8332 - classification_loss: 0.1330 398/500 [======================>.......] - ETA: 33s - loss: 0.9662 - regression_loss: 0.8331 - classification_loss: 0.1330 399/500 [======================>.......] - ETA: 33s - loss: 0.9670 - regression_loss: 0.8338 - classification_loss: 0.1333 400/500 [=======================>......] - ETA: 33s - loss: 0.9665 - regression_loss: 0.8334 - classification_loss: 0.1332 401/500 [=======================>......] - ETA: 32s - loss: 0.9663 - regression_loss: 0.8331 - classification_loss: 0.1332 402/500 [=======================>......] - ETA: 32s - loss: 0.9677 - regression_loss: 0.8343 - classification_loss: 0.1334 403/500 [=======================>......] - ETA: 32s - loss: 0.9697 - regression_loss: 0.8358 - classification_loss: 0.1340 404/500 [=======================>......] - ETA: 31s - loss: 0.9699 - regression_loss: 0.8359 - classification_loss: 0.1340 405/500 [=======================>......] - ETA: 31s - loss: 0.9701 - regression_loss: 0.8361 - classification_loss: 0.1340 406/500 [=======================>......] - ETA: 31s - loss: 0.9691 - regression_loss: 0.8353 - classification_loss: 0.1338 407/500 [=======================>......] - ETA: 30s - loss: 0.9680 - regression_loss: 0.8344 - classification_loss: 0.1336 408/500 [=======================>......] - ETA: 30s - loss: 0.9673 - regression_loss: 0.8339 - classification_loss: 0.1334 409/500 [=======================>......] 
- ETA: 30s - loss: 0.9675 - regression_loss: 0.8342 - classification_loss: 0.1333 410/500 [=======================>......] - ETA: 29s - loss: 0.9668 - regression_loss: 0.8337 - classification_loss: 0.1331 411/500 [=======================>......] - ETA: 29s - loss: 0.9677 - regression_loss: 0.8344 - classification_loss: 0.1332 412/500 [=======================>......] - ETA: 29s - loss: 0.9676 - regression_loss: 0.8344 - classification_loss: 0.1332 413/500 [=======================>......] - ETA: 28s - loss: 0.9672 - regression_loss: 0.8340 - classification_loss: 0.1332 414/500 [=======================>......] - ETA: 28s - loss: 0.9661 - regression_loss: 0.8331 - classification_loss: 0.1330 415/500 [=======================>......] - ETA: 28s - loss: 0.9672 - regression_loss: 0.8339 - classification_loss: 0.1333 416/500 [=======================>......] - ETA: 27s - loss: 0.9659 - regression_loss: 0.8328 - classification_loss: 0.1331 417/500 [========================>.....] - ETA: 27s - loss: 0.9664 - regression_loss: 0.8332 - classification_loss: 0.1333 418/500 [========================>.....] - ETA: 27s - loss: 0.9672 - regression_loss: 0.8338 - classification_loss: 0.1333 419/500 [========================>.....] - ETA: 26s - loss: 0.9682 - regression_loss: 0.8347 - classification_loss: 0.1335 420/500 [========================>.....] - ETA: 26s - loss: 0.9686 - regression_loss: 0.8350 - classification_loss: 0.1336 421/500 [========================>.....] - ETA: 26s - loss: 0.9705 - regression_loss: 0.8366 - classification_loss: 0.1339 422/500 [========================>.....] - ETA: 25s - loss: 0.9695 - regression_loss: 0.8358 - classification_loss: 0.1337 423/500 [========================>.....] - ETA: 25s - loss: 0.9695 - regression_loss: 0.8357 - classification_loss: 0.1338 424/500 [========================>.....] - ETA: 25s - loss: 0.9698 - regression_loss: 0.8360 - classification_loss: 0.1338 425/500 [========================>.....] 
- ETA: 24s - loss: 0.9703 - regression_loss: 0.8363 - classification_loss: 0.1339 426/500 [========================>.....] - ETA: 24s - loss: 0.9691 - regression_loss: 0.8353 - classification_loss: 0.1338 427/500 [========================>.....] - ETA: 24s - loss: 0.9686 - regression_loss: 0.8349 - classification_loss: 0.1337 428/500 [========================>.....] - ETA: 23s - loss: 0.9673 - regression_loss: 0.8338 - classification_loss: 0.1335 429/500 [========================>.....] - ETA: 23s - loss: 0.9677 - regression_loss: 0.8342 - classification_loss: 0.1335 430/500 [========================>.....] - ETA: 23s - loss: 0.9668 - regression_loss: 0.8334 - classification_loss: 0.1334 431/500 [========================>.....] - ETA: 22s - loss: 0.9667 - regression_loss: 0.8333 - classification_loss: 0.1333 432/500 [========================>.....] - ETA: 22s - loss: 0.9671 - regression_loss: 0.8337 - classification_loss: 0.1334 433/500 [========================>.....] - ETA: 22s - loss: 0.9661 - regression_loss: 0.8328 - classification_loss: 0.1334 434/500 [=========================>....] - ETA: 21s - loss: 0.9664 - regression_loss: 0.8330 - classification_loss: 0.1334 435/500 [=========================>....] - ETA: 21s - loss: 0.9649 - regression_loss: 0.8316 - classification_loss: 0.1332 436/500 [=========================>....] - ETA: 21s - loss: 0.9656 - regression_loss: 0.8322 - classification_loss: 0.1334 437/500 [=========================>....] - ETA: 20s - loss: 0.9646 - regression_loss: 0.8313 - classification_loss: 0.1333 438/500 [=========================>....] - ETA: 20s - loss: 0.9645 - regression_loss: 0.8313 - classification_loss: 0.1332 439/500 [=========================>....] - ETA: 20s - loss: 0.9631 - regression_loss: 0.8301 - classification_loss: 0.1330 440/500 [=========================>....] - ETA: 19s - loss: 0.9628 - regression_loss: 0.8297 - classification_loss: 0.1331 441/500 [=========================>....] 
- ETA: 19s - loss: 0.9630 - regression_loss: 0.8299 - classification_loss: 0.1331 442/500 [=========================>....] - ETA: 19s - loss: 0.9626 - regression_loss: 0.8296 - classification_loss: 0.1329 443/500 [=========================>....] - ETA: 18s - loss: 0.9618 - regression_loss: 0.8290 - classification_loss: 0.1329 444/500 [=========================>....] - ETA: 18s - loss: 0.9614 - regression_loss: 0.8287 - classification_loss: 0.1327 445/500 [=========================>....] - ETA: 18s - loss: 0.9609 - regression_loss: 0.8283 - classification_loss: 0.1326 446/500 [=========================>....] - ETA: 17s - loss: 0.9608 - regression_loss: 0.8282 - classification_loss: 0.1326 447/500 [=========================>....] - ETA: 17s - loss: 0.9598 - regression_loss: 0.8273 - classification_loss: 0.1325 448/500 [=========================>....] - ETA: 17s - loss: 0.9593 - regression_loss: 0.8269 - classification_loss: 0.1323 449/500 [=========================>....] - ETA: 16s - loss: 0.9588 - regression_loss: 0.8265 - classification_loss: 0.1323 450/500 [==========================>...] - ETA: 16s - loss: 0.9579 - regression_loss: 0.8257 - classification_loss: 0.1321 451/500 [==========================>...] - ETA: 16s - loss: 0.9573 - regression_loss: 0.8253 - classification_loss: 0.1320 452/500 [==========================>...] - ETA: 15s - loss: 0.9568 - regression_loss: 0.8250 - classification_loss: 0.1318 453/500 [==========================>...] - ETA: 15s - loss: 0.9582 - regression_loss: 0.8260 - classification_loss: 0.1321 454/500 [==========================>...] - ETA: 15s - loss: 0.9588 - regression_loss: 0.8265 - classification_loss: 0.1324 455/500 [==========================>...] - ETA: 14s - loss: 0.9591 - regression_loss: 0.8269 - classification_loss: 0.1323 456/500 [==========================>...] - ETA: 14s - loss: 0.9599 - regression_loss: 0.8275 - classification_loss: 0.1324 457/500 [==========================>...] 
- ETA: 14s - loss: 0.9608 - regression_loss: 0.8282 - classification_loss: 0.1326 458/500 [==========================>...] - ETA: 13s - loss: 0.9619 - regression_loss: 0.8291 - classification_loss: 0.1328 459/500 [==========================>...] - ETA: 13s - loss: 0.9631 - regression_loss: 0.8301 - classification_loss: 0.1330 460/500 [==========================>...] - ETA: 13s - loss: 0.9631 - regression_loss: 0.8301 - classification_loss: 0.1330 461/500 [==========================>...] - ETA: 12s - loss: 0.9641 - regression_loss: 0.8309 - classification_loss: 0.1331 462/500 [==========================>...] - ETA: 12s - loss: 0.9629 - regression_loss: 0.8299 - classification_loss: 0.1330 463/500 [==========================>...] - ETA: 12s - loss: 0.9638 - regression_loss: 0.8306 - classification_loss: 0.1332 464/500 [==========================>...] - ETA: 11s - loss: 0.9640 - regression_loss: 0.8308 - classification_loss: 0.1332 465/500 [==========================>...] - ETA: 11s - loss: 0.9634 - regression_loss: 0.8303 - classification_loss: 0.1331 466/500 [==========================>...] - ETA: 11s - loss: 0.9641 - regression_loss: 0.8308 - classification_loss: 0.1332 467/500 [===========================>..] - ETA: 10s - loss: 0.9643 - regression_loss: 0.8311 - classification_loss: 0.1333 468/500 [===========================>..] - ETA: 10s - loss: 0.9645 - regression_loss: 0.8313 - classification_loss: 0.1332 469/500 [===========================>..] - ETA: 10s - loss: 0.9648 - regression_loss: 0.8315 - classification_loss: 0.1333 470/500 [===========================>..] - ETA: 9s - loss: 0.9652 - regression_loss: 0.8320 - classification_loss: 0.1333  471/500 [===========================>..] - ETA: 9s - loss: 0.9653 - regression_loss: 0.8320 - classification_loss: 0.1333 472/500 [===========================>..] - ETA: 9s - loss: 0.9658 - regression_loss: 0.8324 - classification_loss: 0.1334 473/500 [===========================>..] 
- ETA: 8s - loss: 0.9653 - regression_loss: 0.8320 - classification_loss: 0.1333 474/500 [===========================>..] - ETA: 8s - loss: 0.9643 - regression_loss: 0.8312 - classification_loss: 0.1331 475/500 [===========================>..] - ETA: 8s - loss: 0.9634 - regression_loss: 0.8304 - classification_loss: 0.1330 476/500 [===========================>..] - ETA: 7s - loss: 0.9639 - regression_loss: 0.8309 - classification_loss: 0.1331 477/500 [===========================>..] - ETA: 7s - loss: 0.9627 - regression_loss: 0.8298 - classification_loss: 0.1329 478/500 [===========================>..] - ETA: 7s - loss: 0.9613 - regression_loss: 0.8285 - classification_loss: 0.1328 479/500 [===========================>..] - ETA: 6s - loss: 0.9609 - regression_loss: 0.8282 - classification_loss: 0.1327 480/500 [===========================>..] - ETA: 6s - loss: 0.9617 - regression_loss: 0.8288 - classification_loss: 0.1329 481/500 [===========================>..] - ETA: 6s - loss: 0.9618 - regression_loss: 0.8289 - classification_loss: 0.1329 482/500 [===========================>..] - ETA: 5s - loss: 0.9626 - regression_loss: 0.8297 - classification_loss: 0.1329 483/500 [===========================>..] - ETA: 5s - loss: 0.9613 - regression_loss: 0.8286 - classification_loss: 0.1327 484/500 [============================>.] - ETA: 5s - loss: 0.9605 - regression_loss: 0.8279 - classification_loss: 0.1326 485/500 [============================>.] - ETA: 4s - loss: 0.9603 - regression_loss: 0.8277 - classification_loss: 0.1326 486/500 [============================>.] - ETA: 4s - loss: 0.9590 - regression_loss: 0.8266 - classification_loss: 0.1324 487/500 [============================>.] - ETA: 4s - loss: 0.9577 - regression_loss: 0.8255 - classification_loss: 0.1322 488/500 [============================>.] - ETA: 3s - loss: 0.9581 - regression_loss: 0.8259 - classification_loss: 0.1322 489/500 [============================>.] 
- ETA: 3s - loss: 0.9586 - regression_loss: 0.8263 - classification_loss: 0.1323 490/500 [============================>.] - ETA: 3s - loss: 0.9603 - regression_loss: 0.8277 - classification_loss: 0.1326 491/500 [============================>.] - ETA: 2s - loss: 0.9611 - regression_loss: 0.8284 - classification_loss: 0.1327 492/500 [============================>.] - ETA: 2s - loss: 0.9604 - regression_loss: 0.8279 - classification_loss: 0.1326 493/500 [============================>.] - ETA: 2s - loss: 0.9591 - regression_loss: 0.8268 - classification_loss: 0.1324 494/500 [============================>.] - ETA: 1s - loss: 0.9597 - regression_loss: 0.8271 - classification_loss: 0.1326 495/500 [============================>.] - ETA: 1s - loss: 0.9590 - regression_loss: 0.8266 - classification_loss: 0.1325 496/500 [============================>.] - ETA: 1s - loss: 0.9583 - regression_loss: 0.8259 - classification_loss: 0.1324 497/500 [============================>.] - ETA: 0s - loss: 0.9595 - regression_loss: 0.8269 - classification_loss: 0.1326 498/500 [============================>.] - ETA: 0s - loss: 0.9603 - regression_loss: 0.8275 - classification_loss: 0.1328 499/500 [============================>.] - ETA: 0s - loss: 0.9604 - regression_loss: 0.8276 - classification_loss: 0.1328 500/500 [==============================] - 166s 332ms/step - loss: 0.9611 - regression_loss: 0.8281 - classification_loss: 0.1329
1172 instances of class plum with average precision: 0.6321
mAP: 0.6321
Epoch 00032: saving model to ./training/snapshots/resnet101_pascal_32.h5
Epoch 33/150
1/500 [..............................] - ETA: 2:35 - loss: 0.8384 - regression_loss: 0.7176 - classification_loss: 0.1208 2/500 [..............................] - ETA: 2:40 - loss: 1.0499 - regression_loss: 0.8956 - classification_loss: 0.1544 3/500 [..............................] - ETA: 2:44 - loss: 0.8833 - regression_loss: 0.7514 - classification_loss: 0.1319 4/500 [..............................] 
- ETA: 2:42 - loss: 0.7844 - regression_loss: 0.6695 - classification_loss: 0.1149 5/500 [..............................] - ETA: 2:43 - loss: 0.8802 - regression_loss: 0.7491 - classification_loss: 0.1311 6/500 [..............................] - ETA: 2:43 - loss: 0.9183 - regression_loss: 0.7865 - classification_loss: 0.1318 7/500 [..............................] - ETA: 2:42 - loss: 0.8663 - regression_loss: 0.7449 - classification_loss: 0.1214 8/500 [..............................] - ETA: 2:42 - loss: 0.9116 - regression_loss: 0.7907 - classification_loss: 0.1209 9/500 [..............................] - ETA: 2:43 - loss: 0.9347 - regression_loss: 0.8093 - classification_loss: 0.1254 10/500 [..............................] - ETA: 2:43 - loss: 0.9575 - regression_loss: 0.8266 - classification_loss: 0.1309 11/500 [..............................] - ETA: 2:43 - loss: 0.9799 - regression_loss: 0.8475 - classification_loss: 0.1324 12/500 [..............................] - ETA: 2:43 - loss: 0.9813 - regression_loss: 0.8485 - classification_loss: 0.1328 13/500 [..............................] - ETA: 2:42 - loss: 0.9393 - regression_loss: 0.8137 - classification_loss: 0.1255 14/500 [..............................] - ETA: 2:41 - loss: 0.9536 - regression_loss: 0.8253 - classification_loss: 0.1283 15/500 [..............................] - ETA: 2:41 - loss: 0.9981 - regression_loss: 0.8612 - classification_loss: 0.1369 16/500 [..............................] - ETA: 2:41 - loss: 0.9987 - regression_loss: 0.8626 - classification_loss: 0.1361 17/500 [>.............................] - ETA: 2:40 - loss: 1.0133 - regression_loss: 0.8802 - classification_loss: 0.1330 18/500 [>.............................] - ETA: 2:39 - loss: 1.0103 - regression_loss: 0.8768 - classification_loss: 0.1335 19/500 [>.............................] - ETA: 2:39 - loss: 0.9895 - regression_loss: 0.8591 - classification_loss: 0.1304 20/500 [>.............................] 
- ETA: 2:38 - loss: 0.9589 - regression_loss: 0.8317 - classification_loss: 0.1272 21/500 [>.............................] - ETA: 2:38 - loss: 0.9711 - regression_loss: 0.8420 - classification_loss: 0.1292 22/500 [>.............................] - ETA: 2:38 - loss: 0.9606 - regression_loss: 0.8328 - classification_loss: 0.1278 23/500 [>.............................] - ETA: 2:38 - loss: 0.9657 - regression_loss: 0.8355 - classification_loss: 0.1302 24/500 [>.............................] - ETA: 2:38 - loss: 0.9813 - regression_loss: 0.8497 - classification_loss: 0.1316 25/500 [>.............................] - ETA: 2:37 - loss: 0.9776 - regression_loss: 0.8465 - classification_loss: 0.1311 26/500 [>.............................] - ETA: 2:37 - loss: 0.9725 - regression_loss: 0.8438 - classification_loss: 0.1286 27/500 [>.............................] - ETA: 2:37 - loss: 0.9674 - regression_loss: 0.8399 - classification_loss: 0.1274 28/500 [>.............................] - ETA: 2:36 - loss: 0.9560 - regression_loss: 0.8309 - classification_loss: 0.1252 29/500 [>.............................] - ETA: 2:35 - loss: 0.9590 - regression_loss: 0.8331 - classification_loss: 0.1258 30/500 [>.............................] - ETA: 2:35 - loss: 0.9429 - regression_loss: 0.8194 - classification_loss: 0.1235 31/500 [>.............................] - ETA: 2:35 - loss: 0.9588 - regression_loss: 0.8334 - classification_loss: 0.1255 32/500 [>.............................] - ETA: 2:34 - loss: 0.9622 - regression_loss: 0.8358 - classification_loss: 0.1264 33/500 [>.............................] - ETA: 2:34 - loss: 0.9617 - regression_loss: 0.8359 - classification_loss: 0.1257 34/500 [=>............................] - ETA: 2:33 - loss: 0.9455 - regression_loss: 0.8216 - classification_loss: 0.1239 35/500 [=>............................] - ETA: 2:33 - loss: 0.9454 - regression_loss: 0.8225 - classification_loss: 0.1229 36/500 [=>............................] 
- ETA: 2:33 - loss: 0.9355 - regression_loss: 0.8131 - classification_loss: 0.1224 37/500 [=>............................] - ETA: 2:32 - loss: 0.9397 - regression_loss: 0.8167 - classification_loss: 0.1230 38/500 [=>............................] - ETA: 2:32 - loss: 0.9398 - regression_loss: 0.8165 - classification_loss: 0.1233 39/500 [=>............................] - ETA: 2:31 - loss: 0.9366 - regression_loss: 0.8142 - classification_loss: 0.1225 40/500 [=>............................] - ETA: 2:31 - loss: 0.9678 - regression_loss: 0.8400 - classification_loss: 0.1278 41/500 [=>............................] - ETA: 2:30 - loss: 0.9567 - regression_loss: 0.8307 - classification_loss: 0.1260 42/500 [=>............................] - ETA: 2:30 - loss: 0.9590 - regression_loss: 0.8328 - classification_loss: 0.1262 43/500 [=>............................] - ETA: 2:30 - loss: 0.9663 - regression_loss: 0.8398 - classification_loss: 0.1264 44/500 [=>............................] - ETA: 2:29 - loss: 0.9714 - regression_loss: 0.8444 - classification_loss: 0.1270 45/500 [=>............................] - ETA: 2:29 - loss: 0.9762 - regression_loss: 0.8484 - classification_loss: 0.1278 46/500 [=>............................] - ETA: 2:29 - loss: 0.9726 - regression_loss: 0.8453 - classification_loss: 0.1273 47/500 [=>............................] - ETA: 2:28 - loss: 0.9674 - regression_loss: 0.8403 - classification_loss: 0.1272 48/500 [=>............................] - ETA: 2:28 - loss: 0.9656 - regression_loss: 0.8388 - classification_loss: 0.1268 49/500 [=>............................] - ETA: 2:28 - loss: 0.9594 - regression_loss: 0.8337 - classification_loss: 0.1256 50/500 [==>...........................] - ETA: 2:27 - loss: 0.9540 - regression_loss: 0.8292 - classification_loss: 0.1248 51/500 [==>...........................] - ETA: 2:27 - loss: 0.9572 - regression_loss: 0.8317 - classification_loss: 0.1254 52/500 [==>...........................] 
- ETA: 2:27 - loss: 0.9533 - regression_loss: 0.8277 - classification_loss: 0.1256 53/500 [==>...........................] - ETA: 2:26 - loss: 0.9519 - regression_loss: 0.8264 - classification_loss: 0.1255 54/500 [==>...........................] - ETA: 2:26 - loss: 0.9585 - regression_loss: 0.8302 - classification_loss: 0.1283 55/500 [==>...........................] - ETA: 2:26 - loss: 0.9574 - regression_loss: 0.8290 - classification_loss: 0.1283 56/500 [==>...........................] - ETA: 2:26 - loss: 0.9495 - regression_loss: 0.8230 - classification_loss: 0.1265 57/500 [==>...........................] - ETA: 2:25 - loss: 0.9456 - regression_loss: 0.8198 - classification_loss: 0.1258 58/500 [==>...........................] - ETA: 2:25 - loss: 0.9412 - regression_loss: 0.8156 - classification_loss: 0.1256 59/500 [==>...........................] - ETA: 2:25 - loss: 0.9413 - regression_loss: 0.8156 - classification_loss: 0.1258 60/500 [==>...........................] - ETA: 2:25 - loss: 0.9310 - regression_loss: 0.8065 - classification_loss: 0.1244 61/500 [==>...........................] - ETA: 2:24 - loss: 0.9340 - regression_loss: 0.8094 - classification_loss: 0.1245 62/500 [==>...........................] - ETA: 2:24 - loss: 0.9311 - regression_loss: 0.8073 - classification_loss: 0.1238 63/500 [==>...........................] - ETA: 2:24 - loss: 0.9284 - regression_loss: 0.8053 - classification_loss: 0.1230 64/500 [==>...........................] - ETA: 2:23 - loss: 0.9395 - regression_loss: 0.8143 - classification_loss: 0.1252 65/500 [==>...........................] - ETA: 2:23 - loss: 0.9439 - regression_loss: 0.8182 - classification_loss: 0.1257 66/500 [==>...........................] - ETA: 2:23 - loss: 0.9417 - regression_loss: 0.8162 - classification_loss: 0.1255 67/500 [===>..........................] - ETA: 2:22 - loss: 0.9450 - regression_loss: 0.8190 - classification_loss: 0.1260 68/500 [===>..........................] 
- ETA: 2:22 - loss: 0.9466 - regression_loss: 0.8204 - classification_loss: 0.1261 69/500 [===>..........................] - ETA: 2:22 - loss: 0.9529 - regression_loss: 0.8256 - classification_loss: 0.1273 70/500 [===>..........................] - ETA: 2:22 - loss: 0.9443 - regression_loss: 0.8182 - classification_loss: 0.1261 71/500 [===>..........................] - ETA: 2:21 - loss: 0.9362 - regression_loss: 0.8112 - classification_loss: 0.1251 72/500 [===>..........................] - ETA: 2:21 - loss: 0.9311 - regression_loss: 0.8065 - classification_loss: 0.1246 73/500 [===>..........................] - ETA: 2:20 - loss: 0.9338 - regression_loss: 0.8092 - classification_loss: 0.1247 74/500 [===>..........................] - ETA: 2:20 - loss: 0.9428 - regression_loss: 0.8170 - classification_loss: 0.1259 75/500 [===>..........................] - ETA: 2:20 - loss: 0.9423 - regression_loss: 0.8165 - classification_loss: 0.1258 76/500 [===>..........................] - ETA: 2:20 - loss: 0.9451 - regression_loss: 0.8182 - classification_loss: 0.1269 77/500 [===>..........................] - ETA: 2:19 - loss: 0.9496 - regression_loss: 0.8216 - classification_loss: 0.1280 78/500 [===>..........................] - ETA: 2:19 - loss: 0.9539 - regression_loss: 0.8245 - classification_loss: 0.1293 79/500 [===>..........................] - ETA: 2:19 - loss: 0.9505 - regression_loss: 0.8222 - classification_loss: 0.1282 80/500 [===>..........................] - ETA: 2:18 - loss: 0.9581 - regression_loss: 0.8283 - classification_loss: 0.1298 81/500 [===>..........................] - ETA: 2:18 - loss: 0.9546 - regression_loss: 0.8254 - classification_loss: 0.1291 82/500 [===>..........................] - ETA: 2:18 - loss: 0.9534 - regression_loss: 0.8242 - classification_loss: 0.1291 83/500 [===>..........................] - ETA: 2:17 - loss: 0.9471 - regression_loss: 0.8190 - classification_loss: 0.1281 84/500 [====>.........................] 
- ETA: 2:17 - loss: 0.9412 - regression_loss: 0.8139 - classification_loss: 0.1273 85/500 [====>.........................] - ETA: 2:17 - loss: 0.9403 - regression_loss: 0.8127 - classification_loss: 0.1276 86/500 [====>.........................] - ETA: 2:16 - loss: 0.9357 - regression_loss: 0.8088 - classification_loss: 0.1269 87/500 [====>.........................] - ETA: 2:16 - loss: 0.9337 - regression_loss: 0.8076 - classification_loss: 0.1261 88/500 [====>.........................] - ETA: 2:15 - loss: 0.9341 - regression_loss: 0.8079 - classification_loss: 0.1262 89/500 [====>.........................] - ETA: 2:15 - loss: 0.9294 - regression_loss: 0.8041 - classification_loss: 0.1254 90/500 [====>.........................] - ETA: 2:15 - loss: 0.9284 - regression_loss: 0.8028 - classification_loss: 0.1256 91/500 [====>.........................] - ETA: 2:14 - loss: 0.9351 - regression_loss: 0.8086 - classification_loss: 0.1266 92/500 [====>.........................] - ETA: 2:14 - loss: 0.9317 - regression_loss: 0.8056 - classification_loss: 0.1261 93/500 [====>.........................] - ETA: 2:14 - loss: 0.9288 - regression_loss: 0.8033 - classification_loss: 0.1255 94/500 [====>.........................] - ETA: 2:13 - loss: 0.9300 - regression_loss: 0.8042 - classification_loss: 0.1258 95/500 [====>.........................] - ETA: 2:13 - loss: 0.9316 - regression_loss: 0.8054 - classification_loss: 0.1262 96/500 [====>.........................] - ETA: 2:13 - loss: 0.9342 - regression_loss: 0.8075 - classification_loss: 0.1267 97/500 [====>.........................] - ETA: 2:12 - loss: 0.9283 - regression_loss: 0.8025 - classification_loss: 0.1259 98/500 [====>.........................] - ETA: 2:12 - loss: 0.9311 - regression_loss: 0.8048 - classification_loss: 0.1263 99/500 [====>.........................] - ETA: 2:12 - loss: 0.9301 - regression_loss: 0.8040 - classification_loss: 0.1261 100/500 [=====>........................] 
- ETA: 2:11 - loss: 0.9334 - regression_loss: 0.8071 - classification_loss: 0.1263 101/500 [=====>........................] - ETA: 2:11 - loss: 0.9360 - regression_loss: 0.8095 - classification_loss: 0.1265 102/500 [=====>........................] - ETA: 2:11 - loss: 0.9304 - regression_loss: 0.8046 - classification_loss: 0.1258 103/500 [=====>........................] - ETA: 2:11 - loss: 0.9255 - regression_loss: 0.8003 - classification_loss: 0.1252 104/500 [=====>........................] - ETA: 2:10 - loss: 0.9291 - regression_loss: 0.8036 - classification_loss: 0.1256 105/500 [=====>........................] - ETA: 2:10 - loss: 0.9279 - regression_loss: 0.8027 - classification_loss: 0.1252 106/500 [=====>........................] - ETA: 2:10 - loss: 0.9279 - regression_loss: 0.8028 - classification_loss: 0.1251 107/500 [=====>........................] - ETA: 2:09 - loss: 0.9315 - regression_loss: 0.8056 - classification_loss: 0.1259 108/500 [=====>........................] - ETA: 2:09 - loss: 0.9319 - regression_loss: 0.8058 - classification_loss: 0.1261 109/500 [=====>........................] - ETA: 2:09 - loss: 0.9306 - regression_loss: 0.8042 - classification_loss: 0.1264 110/500 [=====>........................] - ETA: 2:08 - loss: 0.9301 - regression_loss: 0.8040 - classification_loss: 0.1261 111/500 [=====>........................] - ETA: 2:08 - loss: 0.9261 - regression_loss: 0.8008 - classification_loss: 0.1253 112/500 [=====>........................] - ETA: 2:08 - loss: 0.9260 - regression_loss: 0.8007 - classification_loss: 0.1253 113/500 [=====>........................] - ETA: 2:07 - loss: 0.9221 - regression_loss: 0.7973 - classification_loss: 0.1248 114/500 [=====>........................] - ETA: 2:07 - loss: 0.9180 - regression_loss: 0.7938 - classification_loss: 0.1242 115/500 [=====>........................] - ETA: 2:07 - loss: 0.9197 - regression_loss: 0.7958 - classification_loss: 0.1239 116/500 [=====>........................] 
- ETA: 2:06 - loss: 0.9182 - regression_loss: 0.7943 - classification_loss: 0.1238 117/500 [======>.......................] - ETA: 2:06 - loss: 0.9125 - regression_loss: 0.7894 - classification_loss: 0.1231 118/500 [======>.......................] - ETA: 2:06 - loss: 0.9167 - regression_loss: 0.7930 - classification_loss: 0.1236 119/500 [======>.......................] - ETA: 2:05 - loss: 0.9124 - regression_loss: 0.7893 - classification_loss: 0.1231 120/500 [======>.......................] - ETA: 2:05 - loss: 0.9124 - regression_loss: 0.7893 - classification_loss: 0.1231 121/500 [======>.......................] - ETA: 2:05 - loss: 0.9160 - regression_loss: 0.7925 - classification_loss: 0.1235 122/500 [======>.......................] - ETA: 2:04 - loss: 0.9134 - regression_loss: 0.7903 - classification_loss: 0.1231 123/500 [======>.......................] - ETA: 2:04 - loss: 0.9120 - regression_loss: 0.7893 - classification_loss: 0.1227 124/500 [======>.......................] - ETA: 2:04 - loss: 0.9168 - regression_loss: 0.7934 - classification_loss: 0.1235 125/500 [======>.......................] - ETA: 2:03 - loss: 0.9238 - regression_loss: 0.7993 - classification_loss: 0.1245 126/500 [======>.......................] - ETA: 2:03 - loss: 0.9237 - regression_loss: 0.7992 - classification_loss: 0.1246 127/500 [======>.......................] - ETA: 2:03 - loss: 0.9279 - regression_loss: 0.8025 - classification_loss: 0.1254 128/500 [======>.......................] - ETA: 2:02 - loss: 0.9291 - regression_loss: 0.8035 - classification_loss: 0.1256 129/500 [======>.......................] - ETA: 2:02 - loss: 0.9302 - regression_loss: 0.8041 - classification_loss: 0.1261 130/500 [======>.......................] - ETA: 2:02 - loss: 0.9285 - regression_loss: 0.8029 - classification_loss: 0.1256 131/500 [======>.......................] - ETA: 2:01 - loss: 0.9345 - regression_loss: 0.8080 - classification_loss: 0.1265 132/500 [======>.......................] 
- ETA: 2:01 - loss: 0.9325 - regression_loss: 0.8064 - classification_loss: 0.1261 133/500 [======>.......................] - ETA: 2:01 - loss: 0.9358 - regression_loss: 0.8091 - classification_loss: 0.1267 134/500 [=======>......................] - ETA: 2:00 - loss: 0.9339 - regression_loss: 0.8077 - classification_loss: 0.1262 135/500 [=======>......................] - ETA: 2:00 - loss: 0.9358 - regression_loss: 0.8093 - classification_loss: 0.1266 136/500 [=======>......................] - ETA: 2:00 - loss: 0.9326 - regression_loss: 0.8064 - classification_loss: 0.1262 137/500 [=======>......................] - ETA: 1:59 - loss: 0.9341 - regression_loss: 0.8076 - classification_loss: 0.1265 138/500 [=======>......................] - ETA: 1:59 - loss: 0.9385 - regression_loss: 0.8110 - classification_loss: 0.1275 139/500 [=======>......................] - ETA: 1:59 - loss: 0.9373 - regression_loss: 0.8101 - classification_loss: 0.1272 140/500 [=======>......................] - ETA: 1:58 - loss: 0.9387 - regression_loss: 0.8112 - classification_loss: 0.1274 141/500 [=======>......................] - ETA: 1:58 - loss: 0.9391 - regression_loss: 0.8118 - classification_loss: 0.1274 142/500 [=======>......................] - ETA: 1:58 - loss: 0.9358 - regression_loss: 0.8087 - classification_loss: 0.1271 143/500 [=======>......................] - ETA: 1:57 - loss: 0.9347 - regression_loss: 0.8077 - classification_loss: 0.1270 144/500 [=======>......................] - ETA: 1:57 - loss: 0.9354 - regression_loss: 0.8084 - classification_loss: 0.1270 145/500 [=======>......................] - ETA: 1:57 - loss: 0.9333 - regression_loss: 0.8069 - classification_loss: 0.1265 146/500 [=======>......................] - ETA: 1:56 - loss: 0.9286 - regression_loss: 0.8027 - classification_loss: 0.1259 147/500 [=======>......................] - ETA: 1:56 - loss: 0.9294 - regression_loss: 0.8031 - classification_loss: 0.1263 148/500 [=======>......................] 
- ETA: 1:56 - loss: 0.9330 - regression_loss: 0.8061 - classification_loss: 0.1269 149/500 [=======>......................] - ETA: 1:56 - loss: 0.9347 - regression_loss: 0.8073 - classification_loss: 0.1274 150/500 [========>.....................] - ETA: 1:55 - loss: 0.9387 - regression_loss: 0.8107 - classification_loss: 0.1280 151/500 [========>.....................] - ETA: 1:55 - loss: 0.9354 - regression_loss: 0.8079 - classification_loss: 0.1276 152/500 [========>.....................] - ETA: 1:55 - loss: 0.9353 - regression_loss: 0.8078 - classification_loss: 0.1275 153/500 [========>.....................] - ETA: 1:54 - loss: 0.9370 - regression_loss: 0.8091 - classification_loss: 0.1278 154/500 [========>.....................] - ETA: 1:54 - loss: 0.9350 - regression_loss: 0.8073 - classification_loss: 0.1277 155/500 [========>.....................] - ETA: 1:53 - loss: 0.9355 - regression_loss: 0.8079 - classification_loss: 0.1276 156/500 [========>.....................] - ETA: 1:53 - loss: 0.9325 - regression_loss: 0.8053 - classification_loss: 0.1272 157/500 [========>.....................] - ETA: 1:53 - loss: 0.9304 - regression_loss: 0.8037 - classification_loss: 0.1267 158/500 [========>.....................] - ETA: 1:52 - loss: 0.9359 - regression_loss: 0.8085 - classification_loss: 0.1274 159/500 [========>.....................] - ETA: 1:52 - loss: 0.9368 - regression_loss: 0.8092 - classification_loss: 0.1276 160/500 [========>.....................] - ETA: 1:52 - loss: 0.9362 - regression_loss: 0.8088 - classification_loss: 0.1274 161/500 [========>.....................] - ETA: 1:51 - loss: 0.9364 - regression_loss: 0.8091 - classification_loss: 0.1273 162/500 [========>.....................] - ETA: 1:51 - loss: 0.9386 - regression_loss: 0.8109 - classification_loss: 0.1277 163/500 [========>.....................] - ETA: 1:51 - loss: 0.9398 - regression_loss: 0.8120 - classification_loss: 0.1279 164/500 [========>.....................] 
- ETA: 1:50 - loss: 0.9364 - regression_loss: 0.8091 - classification_loss: 0.1273 165/500 [========>.....................] - ETA: 1:50 - loss: 0.9329 - regression_loss: 0.8062 - classification_loss: 0.1267 166/500 [========>.....................] - ETA: 1:50 - loss: 0.9350 - regression_loss: 0.8082 - classification_loss: 0.1268 167/500 [=========>....................] - ETA: 1:49 - loss: 0.9325 - regression_loss: 0.8060 - classification_loss: 0.1265 168/500 [=========>....................] - ETA: 1:49 - loss: 0.9290 - regression_loss: 0.8029 - classification_loss: 0.1262 169/500 [=========>....................] - ETA: 1:49 - loss: 0.9321 - regression_loss: 0.8058 - classification_loss: 0.1263 170/500 [=========>....................] - ETA: 1:48 - loss: 0.9364 - regression_loss: 0.8094 - classification_loss: 0.1270 171/500 [=========>....................] - ETA: 1:48 - loss: 0.9403 - regression_loss: 0.8128 - classification_loss: 0.1275 172/500 [=========>....................] - ETA: 1:48 - loss: 0.9392 - regression_loss: 0.8120 - classification_loss: 0.1272 173/500 [=========>....................] - ETA: 1:47 - loss: 0.9388 - regression_loss: 0.8119 - classification_loss: 0.1269 174/500 [=========>....................] - ETA: 1:47 - loss: 0.9366 - regression_loss: 0.8101 - classification_loss: 0.1265 175/500 [=========>....................] - ETA: 1:47 - loss: 0.9346 - regression_loss: 0.8084 - classification_loss: 0.1262 176/500 [=========>....................] - ETA: 1:46 - loss: 0.9358 - regression_loss: 0.8094 - classification_loss: 0.1265 177/500 [=========>....................] - ETA: 1:46 - loss: 0.9360 - regression_loss: 0.8097 - classification_loss: 0.1263 178/500 [=========>....................] - ETA: 1:46 - loss: 0.9376 - regression_loss: 0.8111 - classification_loss: 0.1265 179/500 [=========>....................] - ETA: 1:45 - loss: 0.9361 - regression_loss: 0.8098 - classification_loss: 0.1263 180/500 [=========>....................] 
- ETA: 1:45 - loss: 0.9378 - regression_loss: 0.8111 - classification_loss: 0.1267 181/500 [=========>....................] - ETA: 1:45 - loss: 0.9376 - regression_loss: 0.8108 - classification_loss: 0.1268 182/500 [=========>....................] - ETA: 1:44 - loss: 0.9396 - regression_loss: 0.8124 - classification_loss: 0.1272 183/500 [=========>....................] - ETA: 1:44 - loss: 0.9370 - regression_loss: 0.8102 - classification_loss: 0.1268 184/500 [==========>...................] - ETA: 1:44 - loss: 0.9393 - regression_loss: 0.8122 - classification_loss: 0.1271 185/500 [==========>...................] - ETA: 1:44 - loss: 0.9418 - regression_loss: 0.8144 - classification_loss: 0.1274 186/500 [==========>...................] - ETA: 1:43 - loss: 0.9437 - regression_loss: 0.8162 - classification_loss: 0.1275 187/500 [==========>...................] - ETA: 1:43 - loss: 0.9450 - regression_loss: 0.8173 - classification_loss: 0.1277 188/500 [==========>...................] - ETA: 1:43 - loss: 0.9476 - regression_loss: 0.8194 - classification_loss: 0.1283 189/500 [==========>...................] - ETA: 1:42 - loss: 0.9474 - regression_loss: 0.8190 - classification_loss: 0.1284 190/500 [==========>...................] - ETA: 1:42 - loss: 0.9475 - regression_loss: 0.8191 - classification_loss: 0.1284 191/500 [==========>...................] - ETA: 1:42 - loss: 0.9449 - regression_loss: 0.8167 - classification_loss: 0.1282 192/500 [==========>...................] - ETA: 1:41 - loss: 0.9460 - regression_loss: 0.8175 - classification_loss: 0.1285 193/500 [==========>...................] - ETA: 1:41 - loss: 0.9464 - regression_loss: 0.8180 - classification_loss: 0.1284 194/500 [==========>...................] - ETA: 1:41 - loss: 0.9455 - regression_loss: 0.8172 - classification_loss: 0.1283 195/500 [==========>...................] - ETA: 1:40 - loss: 0.9447 - regression_loss: 0.8166 - classification_loss: 0.1281 196/500 [==========>...................] 
- ETA: 1:40 - loss: 0.9459 - regression_loss: 0.8173 - classification_loss: 0.1286 197/500 [==========>...................] - ETA: 1:40 - loss: 0.9484 - regression_loss: 0.8191 - classification_loss: 0.1293 198/500 [==========>...................] - ETA: 1:39 - loss: 0.9462 - regression_loss: 0.8172 - classification_loss: 0.1289 199/500 [==========>...................] - ETA: 1:39 - loss: 0.9470 - regression_loss: 0.8181 - classification_loss: 0.1290 200/500 [===========>..................] - ETA: 1:39 - loss: 0.9448 - regression_loss: 0.8162 - classification_loss: 0.1286 201/500 [===========>..................] - ETA: 1:38 - loss: 0.9431 - regression_loss: 0.8148 - classification_loss: 0.1283 202/500 [===========>..................] - ETA: 1:38 - loss: 0.9454 - regression_loss: 0.8167 - classification_loss: 0.1287 203/500 [===========>..................] - ETA: 1:38 - loss: 0.9433 - regression_loss: 0.8150 - classification_loss: 0.1284 204/500 [===========>..................] - ETA: 1:37 - loss: 0.9432 - regression_loss: 0.8147 - classification_loss: 0.1286 205/500 [===========>..................] - ETA: 1:37 - loss: 0.9425 - regression_loss: 0.8140 - classification_loss: 0.1285 206/500 [===========>..................] - ETA: 1:37 - loss: 0.9428 - regression_loss: 0.8146 - classification_loss: 0.1282 207/500 [===========>..................] - ETA: 1:36 - loss: 0.9440 - regression_loss: 0.8155 - classification_loss: 0.1286 208/500 [===========>..................] - ETA: 1:36 - loss: 0.9441 - regression_loss: 0.8157 - classification_loss: 0.1284 209/500 [===========>..................] - ETA: 1:36 - loss: 0.9449 - regression_loss: 0.8164 - classification_loss: 0.1285 210/500 [===========>..................] - ETA: 1:35 - loss: 0.9454 - regression_loss: 0.8167 - classification_loss: 0.1286 211/500 [===========>..................] - ETA: 1:35 - loss: 0.9456 - regression_loss: 0.8171 - classification_loss: 0.1285 212/500 [===========>..................] 
- ETA: 1:35 - loss: 0.9444 - regression_loss: 0.8161 - classification_loss: 0.1283 213/500 [===========>..................] - ETA: 1:34 - loss: 0.9427 - regression_loss: 0.8146 - classification_loss: 0.1280 214/500 [===========>..................] - ETA: 1:34 - loss: 0.9424 - regression_loss: 0.8143 - classification_loss: 0.1281 215/500 [===========>..................] - ETA: 1:34 - loss: 0.9445 - regression_loss: 0.8160 - classification_loss: 0.1285 216/500 [===========>..................] - ETA: 1:33 - loss: 0.9473 - regression_loss: 0.8182 - classification_loss: 0.1291 217/500 [============>.................] - ETA: 1:33 - loss: 0.9444 - regression_loss: 0.8157 - classification_loss: 0.1287 218/500 [============>.................] - ETA: 1:33 - loss: 0.9425 - regression_loss: 0.8141 - classification_loss: 0.1284 219/500 [============>.................] - ETA: 1:32 - loss: 0.9409 - regression_loss: 0.8127 - classification_loss: 0.1282 220/500 [============>.................] - ETA: 1:32 - loss: 0.9396 - regression_loss: 0.8116 - classification_loss: 0.1280 221/500 [============>.................] - ETA: 1:32 - loss: 0.9390 - regression_loss: 0.8113 - classification_loss: 0.1277 222/500 [============>.................] - ETA: 1:31 - loss: 0.9363 - regression_loss: 0.8090 - classification_loss: 0.1274 223/500 [============>.................] - ETA: 1:31 - loss: 0.9343 - regression_loss: 0.8073 - classification_loss: 0.1270 224/500 [============>.................] - ETA: 1:31 - loss: 0.9337 - regression_loss: 0.8070 - classification_loss: 0.1267 225/500 [============>.................] - ETA: 1:30 - loss: 0.9349 - regression_loss: 0.8081 - classification_loss: 0.1268 226/500 [============>.................] - ETA: 1:30 - loss: 0.9353 - regression_loss: 0.8084 - classification_loss: 0.1269 227/500 [============>.................] - ETA: 1:30 - loss: 0.9374 - regression_loss: 0.8102 - classification_loss: 0.1272 228/500 [============>.................] 
- ETA: 1:29 - loss: 0.9388 - regression_loss: 0.8113 - classification_loss: 0.1275 229/500 [============>.................] - ETA: 1:29 - loss: 0.9400 - regression_loss: 0.8124 - classification_loss: 0.1276 230/500 [============>.................] - ETA: 1:29 - loss: 0.9389 - regression_loss: 0.8114 - classification_loss: 0.1275 231/500 [============>.................] - ETA: 1:28 - loss: 0.9399 - regression_loss: 0.8123 - classification_loss: 0.1277 232/500 [============>.................] - ETA: 1:28 - loss: 0.9406 - regression_loss: 0.8127 - classification_loss: 0.1279 233/500 [============>.................] - ETA: 1:28 - loss: 0.9410 - regression_loss: 0.8131 - classification_loss: 0.1279 234/500 [=============>................] - ETA: 1:27 - loss: 0.9432 - regression_loss: 0.8148 - classification_loss: 0.1284 235/500 [=============>................] - ETA: 1:27 - loss: 0.9416 - regression_loss: 0.8135 - classification_loss: 0.1281 236/500 [=============>................] - ETA: 1:27 - loss: 0.9411 - regression_loss: 0.8133 - classification_loss: 0.1279 237/500 [=============>................] - ETA: 1:26 - loss: 0.9435 - regression_loss: 0.8152 - classification_loss: 0.1283 238/500 [=============>................] - ETA: 1:26 - loss: 0.9434 - regression_loss: 0.8150 - classification_loss: 0.1284 239/500 [=============>................] - ETA: 1:26 - loss: 0.9439 - regression_loss: 0.8154 - classification_loss: 0.1285 240/500 [=============>................] - ETA: 1:25 - loss: 0.9453 - regression_loss: 0.8164 - classification_loss: 0.1290 241/500 [=============>................] - ETA: 1:25 - loss: 0.9473 - regression_loss: 0.8177 - classification_loss: 0.1296 242/500 [=============>................] - ETA: 1:25 - loss: 0.9452 - regression_loss: 0.8159 - classification_loss: 0.1293 243/500 [=============>................] - ETA: 1:24 - loss: 0.9448 - regression_loss: 0.8156 - classification_loss: 0.1292 244/500 [=============>................] 
- ETA: 1:24 - loss: 0.9458 - regression_loss: 0.8165 - classification_loss: 0.1293 245/500 [=============>................] - ETA: 1:24 - loss: 0.9446 - regression_loss: 0.8153 - classification_loss: 0.1293 246/500 [=============>................] - ETA: 1:23 - loss: 0.9452 - regression_loss: 0.8159 - classification_loss: 0.1293 247/500 [=============>................] - ETA: 1:23 - loss: 0.9449 - regression_loss: 0.8157 - classification_loss: 0.1292 248/500 [=============>................] - ETA: 1:23 - loss: 0.9424 - regression_loss: 0.8136 - classification_loss: 0.1289 249/500 [=============>................] - ETA: 1:22 - loss: 0.9401 - regression_loss: 0.8115 - classification_loss: 0.1286 250/500 [==============>...............] - ETA: 1:22 - loss: 0.9414 - regression_loss: 0.8125 - classification_loss: 0.1288 251/500 [==============>...............] - ETA: 1:22 - loss: 0.9402 - regression_loss: 0.8113 - classification_loss: 0.1288 252/500 [==============>...............] - ETA: 1:21 - loss: 0.9407 - regression_loss: 0.8121 - classification_loss: 0.1286 253/500 [==============>...............] - ETA: 1:21 - loss: 0.9395 - regression_loss: 0.8111 - classification_loss: 0.1284 254/500 [==============>...............] - ETA: 1:21 - loss: 0.9380 - regression_loss: 0.8099 - classification_loss: 0.1282 255/500 [==============>...............] - ETA: 1:20 - loss: 0.9367 - regression_loss: 0.8086 - classification_loss: 0.1280 256/500 [==============>...............] - ETA: 1:20 - loss: 0.9361 - regression_loss: 0.8082 - classification_loss: 0.1278 257/500 [==============>...............] - ETA: 1:20 - loss: 0.9343 - regression_loss: 0.8065 - classification_loss: 0.1278 258/500 [==============>...............] - ETA: 1:19 - loss: 0.9321 - regression_loss: 0.8047 - classification_loss: 0.1274 259/500 [==============>...............] - ETA: 1:19 - loss: 0.9331 - regression_loss: 0.8054 - classification_loss: 0.1277 260/500 [==============>...............] 
- ETA: 1:19 - loss: 0.9354 - regression_loss: 0.8075 - classification_loss: 0.1279 261/500 [==============>...............] - ETA: 1:18 - loss: 0.9350 - regression_loss: 0.8071 - classification_loss: 0.1279 262/500 [==============>...............] - ETA: 1:18 - loss: 0.9341 - regression_loss: 0.8064 - classification_loss: 0.1278 263/500 [==============>...............] - ETA: 1:18 - loss: 0.9359 - regression_loss: 0.8078 - classification_loss: 0.1280 264/500 [==============>...............] - ETA: 1:17 - loss: 0.9355 - regression_loss: 0.8075 - classification_loss: 0.1280 265/500 [==============>...............] - ETA: 1:17 - loss: 0.9362 - regression_loss: 0.8082 - classification_loss: 0.1280 266/500 [==============>...............] - ETA: 1:17 - loss: 0.9370 - regression_loss: 0.8090 - classification_loss: 0.1280 267/500 [===============>..............] - ETA: 1:16 - loss: 0.9361 - regression_loss: 0.8083 - classification_loss: 0.1278 268/500 [===============>..............] - ETA: 1:16 - loss: 0.9382 - regression_loss: 0.8100 - classification_loss: 0.1282 269/500 [===============>..............] - ETA: 1:16 - loss: 0.9361 - regression_loss: 0.8082 - classification_loss: 0.1279 270/500 [===============>..............] - ETA: 1:15 - loss: 0.9374 - regression_loss: 0.8093 - classification_loss: 0.1282 271/500 [===============>..............] - ETA: 1:15 - loss: 0.9384 - regression_loss: 0.8101 - classification_loss: 0.1283 272/500 [===============>..............] - ETA: 1:15 - loss: 0.9363 - regression_loss: 0.8082 - classification_loss: 0.1281 273/500 [===============>..............] - ETA: 1:14 - loss: 0.9346 - regression_loss: 0.8066 - classification_loss: 0.1280 274/500 [===============>..............] - ETA: 1:14 - loss: 0.9328 - regression_loss: 0.8051 - classification_loss: 0.1277 275/500 [===============>..............] - ETA: 1:14 - loss: 0.9324 - regression_loss: 0.8048 - classification_loss: 0.1276 276/500 [===============>..............] 
- ETA: 1:13 - loss: 0.9311 - regression_loss: 0.8037 - classification_loss: 0.1273 277/500 [===============>..............] - ETA: 1:13 - loss: 0.9290 - regression_loss: 0.8020 - classification_loss: 0.1270 278/500 [===============>..............] - ETA: 1:13 - loss: 0.9269 - regression_loss: 0.8002 - classification_loss: 0.1267 279/500 [===============>..............] - ETA: 1:12 - loss: 0.9296 - regression_loss: 0.8025 - classification_loss: 0.1271 280/500 [===============>..............] - ETA: 1:12 - loss: 0.9315 - regression_loss: 0.8040 - classification_loss: 0.1275 281/500 [===============>..............] - ETA: 1:12 - loss: 0.9316 - regression_loss: 0.8042 - classification_loss: 0.1275 282/500 [===============>..............] - ETA: 1:11 - loss: 0.9329 - regression_loss: 0.8053 - classification_loss: 0.1276 283/500 [===============>..............] - ETA: 1:11 - loss: 0.9337 - regression_loss: 0.8059 - classification_loss: 0.1278 284/500 [================>.............] - ETA: 1:11 - loss: 0.9337 - regression_loss: 0.8059 - classification_loss: 0.1278 285/500 [================>.............] - ETA: 1:10 - loss: 0.9350 - regression_loss: 0.8070 - classification_loss: 0.1280 286/500 [================>.............] - ETA: 1:10 - loss: 0.9355 - regression_loss: 0.8075 - classification_loss: 0.1280 287/500 [================>.............] - ETA: 1:10 - loss: 0.9336 - regression_loss: 0.8059 - classification_loss: 0.1277 288/500 [================>.............] - ETA: 1:09 - loss: 0.9320 - regression_loss: 0.8046 - classification_loss: 0.1274 289/500 [================>.............] - ETA: 1:09 - loss: 0.9338 - regression_loss: 0.8061 - classification_loss: 0.1277 290/500 [================>.............] - ETA: 1:09 - loss: 0.9327 - regression_loss: 0.8052 - classification_loss: 0.1275 291/500 [================>.............] - ETA: 1:08 - loss: 0.9321 - regression_loss: 0.8047 - classification_loss: 0.1274 292/500 [================>.............] 
- ETA: 1:08 - loss: 0.9327 - regression_loss: 0.8053 - classification_loss: 0.1274 293/500 [================>.............] - ETA: 1:08 - loss: 0.9337 - regression_loss: 0.8063 - classification_loss: 0.1274 294/500 [================>.............] - ETA: 1:08 - loss: 0.9336 - regression_loss: 0.8063 - classification_loss: 0.1273 295/500 [================>.............] - ETA: 1:07 - loss: 0.9344 - regression_loss: 0.8070 - classification_loss: 0.1274 296/500 [================>.............] - ETA: 1:07 - loss: 0.9338 - regression_loss: 0.8066 - classification_loss: 0.1273 297/500 [================>.............] - ETA: 1:07 - loss: 0.9355 - regression_loss: 0.8080 - classification_loss: 0.1276 298/500 [================>.............] - ETA: 1:06 - loss: 0.9357 - regression_loss: 0.8080 - classification_loss: 0.1277 299/500 [================>.............] - ETA: 1:06 - loss: 0.9361 - regression_loss: 0.8083 - classification_loss: 0.1278 300/500 [=================>............] - ETA: 1:06 - loss: 0.9376 - regression_loss: 0.8095 - classification_loss: 0.1281 301/500 [=================>............] - ETA: 1:05 - loss: 0.9365 - regression_loss: 0.8086 - classification_loss: 0.1279 302/500 [=================>............] - ETA: 1:05 - loss: 0.9357 - regression_loss: 0.8079 - classification_loss: 0.1277 303/500 [=================>............] - ETA: 1:05 - loss: 0.9340 - regression_loss: 0.8064 - classification_loss: 0.1276 304/500 [=================>............] - ETA: 1:04 - loss: 0.9330 - regression_loss: 0.8056 - classification_loss: 0.1274 305/500 [=================>............] - ETA: 1:04 - loss: 0.9331 - regression_loss: 0.8057 - classification_loss: 0.1274 306/500 [=================>............] - ETA: 1:04 - loss: 0.9336 - regression_loss: 0.8061 - classification_loss: 0.1274 307/500 [=================>............] - ETA: 1:03 - loss: 0.9333 - regression_loss: 0.8059 - classification_loss: 0.1274 308/500 [=================>............] 
- ETA: 1:03 - loss: 0.9343 - regression_loss: 0.8067 - classification_loss: 0.1276 309/500 [=================>............] - ETA: 1:03 - loss: 0.9352 - regression_loss: 0.8075 - classification_loss: 0.1277 310/500 [=================>............] - ETA: 1:02 - loss: 0.9341 - regression_loss: 0.8065 - classification_loss: 0.1275 311/500 [=================>............] - ETA: 1:02 - loss: 0.9329 - regression_loss: 0.8055 - classification_loss: 0.1274 312/500 [=================>............] - ETA: 1:02 - loss: 0.9339 - regression_loss: 0.8064 - classification_loss: 0.1275 313/500 [=================>............] - ETA: 1:01 - loss: 0.9322 - regression_loss: 0.8049 - classification_loss: 0.1273 314/500 [=================>............] - ETA: 1:01 - loss: 0.9311 - regression_loss: 0.8041 - classification_loss: 0.1270 315/500 [=================>............] - ETA: 1:01 - loss: 0.9300 - regression_loss: 0.8031 - classification_loss: 0.1269 316/500 [=================>............] - ETA: 1:00 - loss: 0.9301 - regression_loss: 0.8032 - classification_loss: 0.1269 317/500 [==================>...........] - ETA: 1:00 - loss: 0.9288 - regression_loss: 0.8021 - classification_loss: 0.1267 318/500 [==================>...........] - ETA: 1:00 - loss: 0.9301 - regression_loss: 0.8031 - classification_loss: 0.1270 319/500 [==================>...........] - ETA: 59s - loss: 0.9287 - regression_loss: 0.8019 - classification_loss: 0.1268  320/500 [==================>...........] - ETA: 59s - loss: 0.9292 - regression_loss: 0.8024 - classification_loss: 0.1268 321/500 [==================>...........] - ETA: 59s - loss: 0.9298 - regression_loss: 0.8028 - classification_loss: 0.1269 322/500 [==================>...........] - ETA: 58s - loss: 0.9295 - regression_loss: 0.8027 - classification_loss: 0.1268 323/500 [==================>...........] - ETA: 58s - loss: 0.9276 - regression_loss: 0.8011 - classification_loss: 0.1265 324/500 [==================>...........] 
- ETA: 58s - loss: 0.9258 - regression_loss: 0.7995 - classification_loss: 0.1263 325/500 [==================>...........] - ETA: 57s - loss: 0.9244 - regression_loss: 0.7983 - classification_loss: 0.1261 326/500 [==================>...........] - ETA: 57s - loss: 0.9248 - regression_loss: 0.7986 - classification_loss: 0.1262 327/500 [==================>...........] - ETA: 57s - loss: 0.9259 - regression_loss: 0.7995 - classification_loss: 0.1264 328/500 [==================>...........] - ETA: 56s - loss: 0.9250 - regression_loss: 0.7989 - classification_loss: 0.1261 329/500 [==================>...........] - ETA: 56s - loss: 0.9239 - regression_loss: 0.7979 - classification_loss: 0.1260 330/500 [==================>...........] - ETA: 56s - loss: 0.9243 - regression_loss: 0.7982 - classification_loss: 0.1261 331/500 [==================>...........] - ETA: 55s - loss: 0.9232 - regression_loss: 0.7973 - classification_loss: 0.1259 332/500 [==================>...........] - ETA: 55s - loss: 0.9225 - regression_loss: 0.7968 - classification_loss: 0.1257 333/500 [==================>...........] - ETA: 55s - loss: 0.9241 - regression_loss: 0.7980 - classification_loss: 0.1261 334/500 [===================>..........] - ETA: 54s - loss: 0.9235 - regression_loss: 0.7975 - classification_loss: 0.1260 335/500 [===================>..........] - ETA: 54s - loss: 0.9243 - regression_loss: 0.7983 - classification_loss: 0.1261 336/500 [===================>..........] - ETA: 54s - loss: 0.9227 - regression_loss: 0.7967 - classification_loss: 0.1259 337/500 [===================>..........] - ETA: 53s - loss: 0.9215 - regression_loss: 0.7957 - classification_loss: 0.1258 338/500 [===================>..........] - ETA: 53s - loss: 0.9221 - regression_loss: 0.7962 - classification_loss: 0.1259 339/500 [===================>..........] - ETA: 53s - loss: 0.9223 - regression_loss: 0.7965 - classification_loss: 0.1258 340/500 [===================>..........] 
- ETA: 52s - loss: 0.9211 - regression_loss: 0.7954 - classification_loss: 0.1256 341/500 [===================>..........] - ETA: 52s - loss: 0.9209 - regression_loss: 0.7951 - classification_loss: 0.1258 342/500 [===================>..........] - ETA: 52s - loss: 0.9215 - regression_loss: 0.7956 - classification_loss: 0.1259 343/500 [===================>..........] - ETA: 51s - loss: 0.9219 - regression_loss: 0.7959 - classification_loss: 0.1260 344/500 [===================>..........] - ETA: 51s - loss: 0.9203 - regression_loss: 0.7945 - classification_loss: 0.1257 345/500 [===================>..........] - ETA: 51s - loss: 0.9193 - regression_loss: 0.7937 - classification_loss: 0.1256 346/500 [===================>..........] - ETA: 50s - loss: 0.9212 - regression_loss: 0.7953 - classification_loss: 0.1259 347/500 [===================>..........] - ETA: 50s - loss: 0.9223 - regression_loss: 0.7963 - classification_loss: 0.1261 348/500 [===================>..........] - ETA: 50s - loss: 0.9228 - regression_loss: 0.7967 - classification_loss: 0.1261 349/500 [===================>..........] - ETA: 49s - loss: 0.9228 - regression_loss: 0.7968 - classification_loss: 0.1261 350/500 [====================>.........] - ETA: 49s - loss: 0.9235 - regression_loss: 0.7974 - classification_loss: 0.1261 351/500 [====================>.........] - ETA: 49s - loss: 0.9231 - regression_loss: 0.7970 - classification_loss: 0.1261 352/500 [====================>.........] - ETA: 48s - loss: 0.9225 - regression_loss: 0.7964 - classification_loss: 0.1261 353/500 [====================>.........] - ETA: 48s - loss: 0.9241 - regression_loss: 0.7978 - classification_loss: 0.1263 354/500 [====================>.........] - ETA: 48s - loss: 0.9234 - regression_loss: 0.7973 - classification_loss: 0.1262 355/500 [====================>.........] - ETA: 47s - loss: 0.9241 - regression_loss: 0.7977 - classification_loss: 0.1264 356/500 [====================>.........] 
- ETA: 47s - loss: 0.9228 - regression_loss: 0.7966 - classification_loss: 0.1262 357/500 [====================>.........] - ETA: 47s - loss: 0.9241 - regression_loss: 0.7977 - classification_loss: 0.1264 358/500 [====================>.........] - ETA: 46s - loss: 0.9247 - regression_loss: 0.7982 - classification_loss: 0.1265 359/500 [====================>.........] - ETA: 46s - loss: 0.9237 - regression_loss: 0.7973 - classification_loss: 0.1264 360/500 [====================>.........] - ETA: 46s - loss: 0.9256 - regression_loss: 0.7989 - classification_loss: 0.1267 361/500 [====================>.........] - ETA: 45s - loss: 0.9261 - regression_loss: 0.7993 - classification_loss: 0.1268 362/500 [====================>.........] - ETA: 45s - loss: 0.9267 - regression_loss: 0.7999 - classification_loss: 0.1269 363/500 [====================>.........] - ETA: 45s - loss: 0.9272 - regression_loss: 0.8002 - classification_loss: 0.1270 364/500 [====================>.........] - ETA: 44s - loss: 0.9271 - regression_loss: 0.8001 - classification_loss: 0.1270 365/500 [====================>.........] - ETA: 44s - loss: 0.9270 - regression_loss: 0.8001 - classification_loss: 0.1269 366/500 [====================>.........] - ETA: 44s - loss: 0.9268 - regression_loss: 0.7998 - classification_loss: 0.1271 367/500 [=====================>........] - ETA: 43s - loss: 0.9261 - regression_loss: 0.7992 - classification_loss: 0.1269 368/500 [=====================>........] - ETA: 43s - loss: 0.9266 - regression_loss: 0.7998 - classification_loss: 0.1269 369/500 [=====================>........] - ETA: 43s - loss: 0.9287 - regression_loss: 0.8013 - classification_loss: 0.1273 370/500 [=====================>........] - ETA: 42s - loss: 0.9279 - regression_loss: 0.8008 - classification_loss: 0.1271 371/500 [=====================>........] - ETA: 42s - loss: 0.9275 - regression_loss: 0.8005 - classification_loss: 0.1270 372/500 [=====================>........] 
- ETA: 42s - loss: 0.9276 - regression_loss: 0.8006 - classification_loss: 0.1270 373/500 [=====================>........] - ETA: 41s - loss: 0.9261 - regression_loss: 0.7991 - classification_loss: 0.1270 374/500 [=====================>........] - ETA: 41s - loss: 0.9246 - regression_loss: 0.7978 - classification_loss: 0.1268 375/500 [=====================>........] - ETA: 41s - loss: 0.9238 - regression_loss: 0.7971 - classification_loss: 0.1267 376/500 [=====================>........] - ETA: 40s - loss: 0.9252 - regression_loss: 0.7984 - classification_loss: 0.1268 377/500 [=====================>........] - ETA: 40s - loss: 0.9261 - regression_loss: 0.7991 - classification_loss: 0.1270 378/500 [=====================>........] - ETA: 40s - loss: 0.9262 - regression_loss: 0.7991 - classification_loss: 0.1270 379/500 [=====================>........] - ETA: 39s - loss: 0.9258 - regression_loss: 0.7989 - classification_loss: 0.1269 380/500 [=====================>........] - ETA: 39s - loss: 0.9247 - regression_loss: 0.7979 - classification_loss: 0.1268 381/500 [=====================>........] - ETA: 39s - loss: 0.9257 - regression_loss: 0.7988 - classification_loss: 0.1269 382/500 [=====================>........] - ETA: 38s - loss: 0.9264 - regression_loss: 0.7994 - classification_loss: 0.1270 383/500 [=====================>........] - ETA: 38s - loss: 0.9249 - regression_loss: 0.7982 - classification_loss: 0.1268 384/500 [======================>.......] - ETA: 38s - loss: 0.9257 - regression_loss: 0.7989 - classification_loss: 0.1268 385/500 [======================>.......] - ETA: 37s - loss: 0.9251 - regression_loss: 0.7984 - classification_loss: 0.1267 386/500 [======================>.......] - ETA: 37s - loss: 0.9244 - regression_loss: 0.7979 - classification_loss: 0.1265 387/500 [======================>.......] - ETA: 37s - loss: 0.9246 - regression_loss: 0.7981 - classification_loss: 0.1265 388/500 [======================>.......] 
- ETA: 36s - loss: 0.9232 - regression_loss: 0.7969 - classification_loss: 0.1263 389/500 [======================>.......] - ETA: 36s - loss: 0.9237 - regression_loss: 0.7973 - classification_loss: 0.1263 390/500 [======================>.......] - ETA: 36s - loss: 0.9233 - regression_loss: 0.7970 - classification_loss: 0.1263 391/500 [======================>.......] - ETA: 35s - loss: 0.9229 - regression_loss: 0.7967 - classification_loss: 0.1262 392/500 [======================>.......] - ETA: 35s - loss: 0.9228 - regression_loss: 0.7966 - classification_loss: 0.1262 393/500 [======================>.......] - ETA: 35s - loss: 0.9215 - regression_loss: 0.7954 - classification_loss: 0.1260 394/500 [======================>.......] - ETA: 34s - loss: 0.9223 - regression_loss: 0.7963 - classification_loss: 0.1260 395/500 [======================>.......] - ETA: 34s - loss: 0.9228 - regression_loss: 0.7968 - classification_loss: 0.1260 396/500 [======================>.......] - ETA: 34s - loss: 0.9218 - regression_loss: 0.7960 - classification_loss: 0.1258 397/500 [======================>.......] - ETA: 33s - loss: 0.9213 - regression_loss: 0.7957 - classification_loss: 0.1257 398/500 [======================>.......] - ETA: 33s - loss: 0.9201 - regression_loss: 0.7946 - classification_loss: 0.1255 399/500 [======================>.......] - ETA: 33s - loss: 0.9205 - regression_loss: 0.7950 - classification_loss: 0.1255 400/500 [=======================>......] - ETA: 32s - loss: 0.9207 - regression_loss: 0.7951 - classification_loss: 0.1256 401/500 [=======================>......] - ETA: 32s - loss: 0.9211 - regression_loss: 0.7956 - classification_loss: 0.1256 402/500 [=======================>......] - ETA: 32s - loss: 0.9210 - regression_loss: 0.7955 - classification_loss: 0.1255 403/500 [=======================>......] - ETA: 31s - loss: 0.9232 - regression_loss: 0.7974 - classification_loss: 0.1259 404/500 [=======================>......] 
- ETA: 31s - loss: 0.9233 - regression_loss: 0.7974 - classification_loss: 0.1259 405/500 [=======================>......] - ETA: 31s - loss: 0.9222 - regression_loss: 0.7965 - classification_loss: 0.1257 406/500 [=======================>......] - ETA: 30s - loss: 0.9220 - regression_loss: 0.7964 - classification_loss: 0.1256 407/500 [=======================>......] - ETA: 30s - loss: 0.9218 - regression_loss: 0.7962 - classification_loss: 0.1256 408/500 [=======================>......] - ETA: 30s - loss: 0.9208 - regression_loss: 0.7954 - classification_loss: 0.1254 409/500 [=======================>......] - ETA: 29s - loss: 0.9216 - regression_loss: 0.7961 - classification_loss: 0.1255 410/500 [=======================>......] - ETA: 29s - loss: 0.9204 - regression_loss: 0.7950 - classification_loss: 0.1254 411/500 [=======================>......] - ETA: 29s - loss: 0.9214 - regression_loss: 0.7959 - classification_loss: 0.1255 412/500 [=======================>......] - ETA: 29s - loss: 0.9204 - regression_loss: 0.7951 - classification_loss: 0.1253 413/500 [=======================>......] - ETA: 28s - loss: 0.9189 - regression_loss: 0.7938 - classification_loss: 0.1251 414/500 [=======================>......] - ETA: 28s - loss: 0.9202 - regression_loss: 0.7948 - classification_loss: 0.1254 415/500 [=======================>......] - ETA: 28s - loss: 0.9192 - regression_loss: 0.7939 - classification_loss: 0.1253 416/500 [=======================>......] - ETA: 27s - loss: 0.9195 - regression_loss: 0.7942 - classification_loss: 0.1253 417/500 [========================>.....] - ETA: 27s - loss: 0.9184 - regression_loss: 0.7933 - classification_loss: 0.1251 418/500 [========================>.....] - ETA: 27s - loss: 0.9192 - regression_loss: 0.7941 - classification_loss: 0.1252 419/500 [========================>.....] - ETA: 26s - loss: 0.9198 - regression_loss: 0.7946 - classification_loss: 0.1252 420/500 [========================>.....] 
- ETA: 26s - loss: 0.9201 - regression_loss: 0.7948 - classification_loss: 0.1253 421/500 [========================>.....] - ETA: 26s - loss: 0.9212 - regression_loss: 0.7958 - classification_loss: 0.1254 422/500 [========================>.....] - ETA: 25s - loss: 0.9218 - regression_loss: 0.7963 - classification_loss: 0.1255 423/500 [========================>.....] - ETA: 25s - loss: 0.9232 - regression_loss: 0.7975 - classification_loss: 0.1257 424/500 [========================>.....] - ETA: 25s - loss: 0.9220 - regression_loss: 0.7964 - classification_loss: 0.1256 425/500 [========================>.....] - ETA: 24s - loss: 0.9205 - regression_loss: 0.7951 - classification_loss: 0.1253 426/500 [========================>.....] - ETA: 24s - loss: 0.9195 - regression_loss: 0.7943 - classification_loss: 0.1252 427/500 [========================>.....] - ETA: 24s - loss: 0.9189 - regression_loss: 0.7938 - classification_loss: 0.1251 428/500 [========================>.....] - ETA: 23s - loss: 0.9179 - regression_loss: 0.7929 - classification_loss: 0.1250 429/500 [========================>.....] - ETA: 23s - loss: 0.9177 - regression_loss: 0.7928 - classification_loss: 0.1249 430/500 [========================>.....] - ETA: 23s - loss: 0.9186 - regression_loss: 0.7936 - classification_loss: 0.1250 431/500 [========================>.....] - ETA: 22s - loss: 0.9197 - regression_loss: 0.7946 - classification_loss: 0.1252 432/500 [========================>.....] - ETA: 22s - loss: 0.9203 - regression_loss: 0.7951 - classification_loss: 0.1251 433/500 [========================>.....] - ETA: 22s - loss: 0.9191 - regression_loss: 0.7941 - classification_loss: 0.1250 434/500 [=========================>....] - ETA: 21s - loss: 0.9199 - regression_loss: 0.7948 - classification_loss: 0.1251 435/500 [=========================>....] - ETA: 21s - loss: 0.9203 - regression_loss: 0.7951 - classification_loss: 0.1252 436/500 [=========================>....] 
- ETA: 21s - loss: 0.9209 - regression_loss: 0.7957 - classification_loss: 0.1252 437/500 [=========================>....] - ETA: 20s - loss: 0.9202 - regression_loss: 0.7951 - classification_loss: 0.1251 438/500 [=========================>....] - ETA: 20s - loss: 0.9202 - regression_loss: 0.7951 - classification_loss: 0.1250 439/500 [=========================>....] - ETA: 20s - loss: 0.9210 - regression_loss: 0.7959 - classification_loss: 0.1251 440/500 [=========================>....] - ETA: 19s - loss: 0.9221 - regression_loss: 0.7968 - classification_loss: 0.1252 441/500 [=========================>....] - ETA: 19s - loss: 0.9231 - regression_loss: 0.7979 - classification_loss: 0.1252 442/500 [=========================>....] - ETA: 19s - loss: 0.9232 - regression_loss: 0.7980 - classification_loss: 0.1252 443/500 [=========================>....] - ETA: 18s - loss: 0.9237 - regression_loss: 0.7985 - classification_loss: 0.1252 444/500 [=========================>....] - ETA: 18s - loss: 0.9236 - regression_loss: 0.7984 - classification_loss: 0.1252 445/500 [=========================>....] - ETA: 18s - loss: 0.9249 - regression_loss: 0.7995 - classification_loss: 0.1254 446/500 [=========================>....] - ETA: 17s - loss: 0.9266 - regression_loss: 0.8007 - classification_loss: 0.1258 447/500 [=========================>....] - ETA: 17s - loss: 0.9266 - regression_loss: 0.8008 - classification_loss: 0.1257 448/500 [=========================>....] - ETA: 17s - loss: 0.9258 - regression_loss: 0.8002 - classification_loss: 0.1256 449/500 [=========================>....] - ETA: 16s - loss: 0.9249 - regression_loss: 0.7994 - classification_loss: 0.1255 450/500 [==========================>...] - ETA: 16s - loss: 0.9259 - regression_loss: 0.8004 - classification_loss: 0.1255 451/500 [==========================>...] - ETA: 16s - loss: 0.9257 - regression_loss: 0.8003 - classification_loss: 0.1255 452/500 [==========================>...] 
- ETA: 15s - loss: 0.9262 - regression_loss: 0.8007 - classification_loss: 0.1255 453/500 [==========================>...] - ETA: 15s - loss: 0.9266 - regression_loss: 0.8011 - classification_loss: 0.1254 454/500 [==========================>...] - ETA: 15s - loss: 0.9270 - regression_loss: 0.8014 - classification_loss: 0.1256 455/500 [==========================>...] - ETA: 14s - loss: 0.9274 - regression_loss: 0.8016 - classification_loss: 0.1257 456/500 [==========================>...] - ETA: 14s - loss: 0.9266 - regression_loss: 0.8010 - classification_loss: 0.1256 457/500 [==========================>...] - ETA: 14s - loss: 0.9264 - regression_loss: 0.8008 - classification_loss: 0.1256 458/500 [==========================>...] - ETA: 13s - loss: 0.9279 - regression_loss: 0.8019 - classification_loss: 0.1259 459/500 [==========================>...] - ETA: 13s - loss: 0.9278 - regression_loss: 0.8018 - classification_loss: 0.1260 460/500 [==========================>...] - ETA: 13s - loss: 0.9273 - regression_loss: 0.8014 - classification_loss: 0.1259 461/500 [==========================>...] - ETA: 12s - loss: 0.9270 - regression_loss: 0.8012 - classification_loss: 0.1258 462/500 [==========================>...] - ETA: 12s - loss: 0.9260 - regression_loss: 0.8004 - classification_loss: 0.1256 463/500 [==========================>...] - ETA: 12s - loss: 0.9247 - regression_loss: 0.7992 - classification_loss: 0.1254 464/500 [==========================>...] - ETA: 11s - loss: 0.9256 - regression_loss: 0.7998 - classification_loss: 0.1257 465/500 [==========================>...] - ETA: 11s - loss: 0.9263 - regression_loss: 0.8005 - classification_loss: 0.1258 466/500 [==========================>...] - ETA: 11s - loss: 0.9262 - regression_loss: 0.8005 - classification_loss: 0.1258 467/500 [===========================>..] - ETA: 10s - loss: 0.9265 - regression_loss: 0.8008 - classification_loss: 0.1257 468/500 [===========================>..] 
- ETA: 10s - loss: 0.9280 - regression_loss: 0.8020 - classification_loss: 0.1260 469/500 [===========================>..] - ETA: 10s - loss: 0.9285 - regression_loss: 0.8025 - classification_loss: 0.1261 470/500 [===========================>..] - ETA: 9s - loss: 0.9275 - regression_loss: 0.8016 - classification_loss: 0.1259  471/500 [===========================>..] - ETA: 9s - loss: 0.9284 - regression_loss: 0.8023 - classification_loss: 0.1261 472/500 [===========================>..] - ETA: 9s - loss: 0.9276 - regression_loss: 0.8017 - classification_loss: 0.1259 473/500 [===========================>..] - ETA: 8s - loss: 0.9279 - regression_loss: 0.8020 - classification_loss: 0.1259 474/500 [===========================>..] - ETA: 8s - loss: 0.9287 - regression_loss: 0.8025 - classification_loss: 0.1261 475/500 [===========================>..] - ETA: 8s - loss: 0.9303 - regression_loss: 0.8039 - classification_loss: 0.1264 476/500 [===========================>..] - ETA: 7s - loss: 0.9313 - regression_loss: 0.8048 - classification_loss: 0.1265 477/500 [===========================>..] - ETA: 7s - loss: 0.9322 - regression_loss: 0.8056 - classification_loss: 0.1266 478/500 [===========================>..] - ETA: 7s - loss: 0.9318 - regression_loss: 0.8052 - classification_loss: 0.1266 479/500 [===========================>..] - ETA: 6s - loss: 0.9309 - regression_loss: 0.8044 - classification_loss: 0.1265 480/500 [===========================>..] - ETA: 6s - loss: 0.9296 - regression_loss: 0.8033 - classification_loss: 0.1263 481/500 [===========================>..] - ETA: 6s - loss: 0.9301 - regression_loss: 0.8037 - classification_loss: 0.1264 482/500 [===========================>..] - ETA: 5s - loss: 0.9309 - regression_loss: 0.8044 - classification_loss: 0.1265 483/500 [===========================>..] - ETA: 5s - loss: 0.9305 - regression_loss: 0.8040 - classification_loss: 0.1265 484/500 [============================>.] 
- ETA: 5s - loss: 0.9309 - regression_loss: 0.8044 - classification_loss: 0.1265 485/500 [============================>.] - ETA: 4s - loss: 0.9313 - regression_loss: 0.8048 - classification_loss: 0.1265 486/500 [============================>.] - ETA: 4s - loss: 0.9307 - regression_loss: 0.8043 - classification_loss: 0.1264 487/500 [============================>.] - ETA: 4s - loss: 0.9308 - regression_loss: 0.8043 - classification_loss: 0.1264 488/500 [============================>.] - ETA: 3s - loss: 0.9295 - regression_loss: 0.8032 - classification_loss: 0.1263 489/500 [============================>.] - ETA: 3s - loss: 0.9307 - regression_loss: 0.8043 - classification_loss: 0.1265 490/500 [============================>.] - ETA: 3s - loss: 0.9297 - regression_loss: 0.8034 - classification_loss: 0.1263 491/500 [============================>.] - ETA: 2s - loss: 0.9297 - regression_loss: 0.8034 - classification_loss: 0.1263 492/500 [============================>.] - ETA: 2s - loss: 0.9301 - regression_loss: 0.8038 - classification_loss: 0.1263 493/500 [============================>.] - ETA: 2s - loss: 0.9294 - regression_loss: 0.8032 - classification_loss: 0.1262 494/500 [============================>.] - ETA: 1s - loss: 0.9294 - regression_loss: 0.8032 - classification_loss: 0.1262 495/500 [============================>.] - ETA: 1s - loss: 0.9302 - regression_loss: 0.8038 - classification_loss: 0.1264 496/500 [============================>.] - ETA: 1s - loss: 0.9296 - regression_loss: 0.8033 - classification_loss: 0.1263 497/500 [============================>.] - ETA: 0s - loss: 0.9308 - regression_loss: 0.8042 - classification_loss: 0.1265 498/500 [============================>.] - ETA: 0s - loss: 0.9311 - regression_loss: 0.8045 - classification_loss: 0.1265 499/500 [============================>.] 
- ETA: 0s - loss: 0.9316 - regression_loss: 0.8049 - classification_loss: 0.1267 500/500 [==============================] - 165s 330ms/step - loss: 0.9308 - regression_loss: 0.8042 - classification_loss: 0.1266
1172 instances of class plum with average precision: 0.6124
mAP: 0.6124
Epoch 00033: saving model to ./training/snapshots/resnet101_pascal_33.h5
Epoch 34/150
1/500 [..............................] - ETA: 2:35 - loss: 1.1719 - regression_loss: 1.0081 - classification_loss: 0.1639 2/500 [..............................] - ETA: 2:34 - loss: 1.1149 - regression_loss: 0.9591 - classification_loss: 0.1557 3/500 [..............................] - ETA: 2:34 - loss: 1.0744 - regression_loss: 0.9343 - classification_loss: 0.1401 4/500 [..............................] - ETA: 2:37 - loss: 0.9514 - regression_loss: 0.8143 - classification_loss: 0.1371 5/500 [..............................] - ETA: 2:39 - loss: 0.9949 - regression_loss: 0.8578 - classification_loss: 0.1371 6/500 [..............................] - ETA: 2:42 - loss: 1.0226 - regression_loss: 0.8824 - classification_loss: 0.1402 7/500 [..............................] - ETA: 2:41 - loss: 1.0306 - regression_loss: 0.8912 - classification_loss: 0.1394 8/500 [..............................] - ETA: 2:40 - loss: 1.0242 - regression_loss: 0.8868 - classification_loss: 0.1374 9/500 [..............................] - ETA: 2:39 - loss: 0.9859 - regression_loss: 0.8535 - classification_loss: 0.1324 10/500 [..............................] - ETA: 2:38 - loss: 1.0325 - regression_loss: 0.8912 - classification_loss: 0.1413 11/500 [..............................] - ETA: 2:38 - loss: 1.0552 - regression_loss: 0.9098 - classification_loss: 0.1454 12/500 [..............................] - ETA: 2:38 - loss: 1.0448 - regression_loss: 0.9025 - classification_loss: 0.1423 13/500 [..............................] - ETA: 2:37 - loss: 1.0312 - regression_loss: 0.8880 - classification_loss: 0.1432 14/500 [..............................] 
- ETA: 2:37 - loss: 0.9844 - regression_loss: 0.8472 - classification_loss: 0.1372 15/500 [..............................] - ETA: 2:36 - loss: 0.9728 - regression_loss: 0.8368 - classification_loss: 0.1360 16/500 [..............................] - ETA: 2:37 - loss: 0.9590 - regression_loss: 0.8242 - classification_loss: 0.1349 17/500 [>.............................] - ETA: 2:38 - loss: 0.9553 - regression_loss: 0.8204 - classification_loss: 0.1349 18/500 [>.............................] - ETA: 2:37 - loss: 0.9713 - regression_loss: 0.8314 - classification_loss: 0.1399 19/500 [>.............................] - ETA: 2:37 - loss: 0.9541 - regression_loss: 0.8166 - classification_loss: 0.1375 20/500 [>.............................] - ETA: 2:36 - loss: 0.9295 - regression_loss: 0.7955 - classification_loss: 0.1340 21/500 [>.............................] - ETA: 2:36 - loss: 0.9223 - regression_loss: 0.7885 - classification_loss: 0.1338 22/500 [>.............................] - ETA: 2:36 - loss: 0.9037 - regression_loss: 0.7728 - classification_loss: 0.1310 23/500 [>.............................] - ETA: 2:35 - loss: 0.8913 - regression_loss: 0.7626 - classification_loss: 0.1287 24/500 [>.............................] - ETA: 2:35 - loss: 0.8765 - regression_loss: 0.7498 - classification_loss: 0.1267 25/500 [>.............................] - ETA: 2:35 - loss: 0.8914 - regression_loss: 0.7635 - classification_loss: 0.1279 26/500 [>.............................] - ETA: 2:35 - loss: 0.8736 - regression_loss: 0.7496 - classification_loss: 0.1241 27/500 [>.............................] - ETA: 2:35 - loss: 0.8956 - regression_loss: 0.7683 - classification_loss: 0.1273 28/500 [>.............................] - ETA: 2:34 - loss: 0.9246 - regression_loss: 0.7929 - classification_loss: 0.1317 29/500 [>.............................] - ETA: 2:34 - loss: 0.9329 - regression_loss: 0.7999 - classification_loss: 0.1330 30/500 [>.............................] 
- ETA: 2:34 - loss: 0.9542 - regression_loss: 0.8177 - classification_loss: 0.1365 31/500 [>.............................] - ETA: 2:33 - loss: 0.9466 - regression_loss: 0.8129 - classification_loss: 0.1338 32/500 [>.............................] - ETA: 2:33 - loss: 0.9557 - regression_loss: 0.8210 - classification_loss: 0.1347 33/500 [>.............................] - ETA: 2:33 - loss: 0.9393 - regression_loss: 0.8075 - classification_loss: 0.1317 34/500 [=>............................] - ETA: 2:32 - loss: 0.9239 - regression_loss: 0.7949 - classification_loss: 0.1290 35/500 [=>............................] - ETA: 2:32 - loss: 0.9289 - regression_loss: 0.7985 - classification_loss: 0.1304 36/500 [=>............................] - ETA: 2:33 - loss: 0.9414 - regression_loss: 0.8096 - classification_loss: 0.1318 37/500 [=>............................] - ETA: 2:32 - loss: 0.9398 - regression_loss: 0.8082 - classification_loss: 0.1316 38/500 [=>............................] - ETA: 2:32 - loss: 0.9318 - regression_loss: 0.8019 - classification_loss: 0.1299 39/500 [=>............................] - ETA: 2:32 - loss: 0.9419 - regression_loss: 0.8096 - classification_loss: 0.1323 40/500 [=>............................] - ETA: 2:31 - loss: 0.9284 - regression_loss: 0.7978 - classification_loss: 0.1306 41/500 [=>............................] - ETA: 2:31 - loss: 0.9299 - regression_loss: 0.8009 - classification_loss: 0.1290 42/500 [=>............................] - ETA: 2:31 - loss: 0.9269 - regression_loss: 0.7989 - classification_loss: 0.1280 43/500 [=>............................] - ETA: 2:31 - loss: 0.9198 - regression_loss: 0.7925 - classification_loss: 0.1273 44/500 [=>............................] - ETA: 2:30 - loss: 0.9134 - regression_loss: 0.7876 - classification_loss: 0.1258 45/500 [=>............................] - ETA: 2:30 - loss: 0.9123 - regression_loss: 0.7864 - classification_loss: 0.1259 46/500 [=>............................] 
- ETA: 2:29 - loss: 0.9207 - regression_loss: 0.7946 - classification_loss: 0.1261 47/500 [=>............................] - ETA: 2:29 - loss: 0.9107 - regression_loss: 0.7859 - classification_loss: 0.1247 48/500 [=>............................] - ETA: 2:29 - loss: 0.9146 - regression_loss: 0.7894 - classification_loss: 0.1251 49/500 [=>............................] - ETA: 2:28 - loss: 0.9252 - regression_loss: 0.7977 - classification_loss: 0.1275 50/500 [==>...........................] - ETA: 2:28 - loss: 0.9260 - regression_loss: 0.7978 - classification_loss: 0.1282 51/500 [==>...........................] - ETA: 2:28 - loss: 0.9153 - regression_loss: 0.7889 - classification_loss: 0.1264 52/500 [==>...........................] - ETA: 2:28 - loss: 0.9069 - regression_loss: 0.7819 - classification_loss: 0.1250 53/500 [==>...........................] - ETA: 2:27 - loss: 0.9132 - regression_loss: 0.7873 - classification_loss: 0.1259 54/500 [==>...........................] - ETA: 2:27 - loss: 0.9066 - regression_loss: 0.7820 - classification_loss: 0.1246 55/500 [==>...........................] - ETA: 2:26 - loss: 0.9090 - regression_loss: 0.7840 - classification_loss: 0.1250 56/500 [==>...........................] - ETA: 2:26 - loss: 0.9114 - regression_loss: 0.7868 - classification_loss: 0.1246 57/500 [==>...........................] - ETA: 2:26 - loss: 0.9140 - regression_loss: 0.7892 - classification_loss: 0.1247 58/500 [==>...........................] - ETA: 2:25 - loss: 0.9088 - regression_loss: 0.7848 - classification_loss: 0.1240 59/500 [==>...........................] - ETA: 2:25 - loss: 0.9037 - regression_loss: 0.7804 - classification_loss: 0.1233 60/500 [==>...........................] - ETA: 2:24 - loss: 0.8960 - regression_loss: 0.7737 - classification_loss: 0.1223 61/500 [==>...........................] - ETA: 2:24 - loss: 0.9116 - regression_loss: 0.7866 - classification_loss: 0.1250 62/500 [==>...........................] 
- ETA: 2:24 - loss: 0.9227 - regression_loss: 0.7956 - classification_loss: 0.1270 63/500 [==>...........................] - ETA: 2:23 - loss: 0.9188 - regression_loss: 0.7922 - classification_loss: 0.1267 64/500 [==>...........................] - ETA: 2:23 - loss: 0.9169 - regression_loss: 0.7901 - classification_loss: 0.1268 65/500 [==>...........................] - ETA: 2:23 - loss: 0.9204 - regression_loss: 0.7933 - classification_loss: 0.1271 66/500 [==>...........................] - ETA: 2:22 - loss: 0.9220 - regression_loss: 0.7959 - classification_loss: 0.1260 67/500 [===>..........................] - ETA: 2:22 - loss: 0.9320 - regression_loss: 0.8046 - classification_loss: 0.1274 68/500 [===>..........................] - ETA: 2:22 - loss: 0.9338 - regression_loss: 0.8057 - classification_loss: 0.1281 69/500 [===>..........................] - ETA: 2:21 - loss: 0.9301 - regression_loss: 0.8027 - classification_loss: 0.1273 70/500 [===>..........................] - ETA: 2:21 - loss: 0.9318 - regression_loss: 0.8055 - classification_loss: 0.1264 71/500 [===>..........................] - ETA: 2:21 - loss: 0.9244 - regression_loss: 0.7992 - classification_loss: 0.1252 72/500 [===>..........................] - ETA: 2:20 - loss: 0.9202 - regression_loss: 0.7960 - classification_loss: 0.1242 73/500 [===>..........................] - ETA: 2:20 - loss: 0.9177 - regression_loss: 0.7938 - classification_loss: 0.1238 74/500 [===>..........................] - ETA: 2:20 - loss: 0.9154 - regression_loss: 0.7920 - classification_loss: 0.1234 75/500 [===>..........................] - ETA: 2:19 - loss: 0.9154 - regression_loss: 0.7919 - classification_loss: 0.1234 76/500 [===>..........................] - ETA: 2:19 - loss: 0.9195 - regression_loss: 0.7955 - classification_loss: 0.1240 77/500 [===>..........................] - ETA: 2:19 - loss: 0.9191 - regression_loss: 0.7952 - classification_loss: 0.1239 78/500 [===>..........................] 
- ETA: 2:19 - loss: 0.9210 - regression_loss: 0.7968 - classification_loss: 0.1242 79/500 [===>..........................] - ETA: 2:18 - loss: 0.9195 - regression_loss: 0.7950 - classification_loss: 0.1246 80/500 [===>..........................] - ETA: 2:18 - loss: 0.9142 - regression_loss: 0.7906 - classification_loss: 0.1235 81/500 [===>..........................] - ETA: 2:18 - loss: 0.9186 - regression_loss: 0.7939 - classification_loss: 0.1247 82/500 [===>..........................] - ETA: 2:17 - loss: 0.9197 - regression_loss: 0.7950 - classification_loss: 0.1247 83/500 [===>..........................] - ETA: 2:17 - loss: 0.9174 - regression_loss: 0.7928 - classification_loss: 0.1246 84/500 [====>.........................] - ETA: 2:17 - loss: 0.9191 - regression_loss: 0.7943 - classification_loss: 0.1248 85/500 [====>.........................] - ETA: 2:17 - loss: 0.9191 - regression_loss: 0.7945 - classification_loss: 0.1246 86/500 [====>.........................] - ETA: 2:16 - loss: 0.9226 - regression_loss: 0.7980 - classification_loss: 0.1246 87/500 [====>.........................] - ETA: 2:16 - loss: 0.9226 - regression_loss: 0.7978 - classification_loss: 0.1248 88/500 [====>.........................] - ETA: 2:15 - loss: 0.9162 - regression_loss: 0.7922 - classification_loss: 0.1239 89/500 [====>.........................] - ETA: 2:15 - loss: 0.9152 - regression_loss: 0.7912 - classification_loss: 0.1240 90/500 [====>.........................] - ETA: 2:15 - loss: 0.9089 - regression_loss: 0.7859 - classification_loss: 0.1230 91/500 [====>.........................] - ETA: 2:15 - loss: 0.9019 - regression_loss: 0.7798 - classification_loss: 0.1221 92/500 [====>.........................] - ETA: 2:14 - loss: 0.8982 - regression_loss: 0.7768 - classification_loss: 0.1214 93/500 [====>.........................] - ETA: 2:14 - loss: 0.8931 - regression_loss: 0.7726 - classification_loss: 0.1205 94/500 [====>.........................] 
- ETA: 2:14 - loss: 0.9009 - regression_loss: 0.7800 - classification_loss: 0.1210 95/500 [====>.........................] - ETA: 2:13 - loss: 0.8996 - regression_loss: 0.7791 - classification_loss: 0.1205 96/500 [====>.........................] - ETA: 2:13 - loss: 0.8969 - regression_loss: 0.7764 - classification_loss: 0.1204 97/500 [====>.........................] - ETA: 2:13 - loss: 0.8994 - regression_loss: 0.7789 - classification_loss: 0.1205 98/500 [====>.........................] - ETA: 2:12 - loss: 0.8967 - regression_loss: 0.7765 - classification_loss: 0.1202 99/500 [====>.........................] - ETA: 2:12 - loss: 0.8933 - regression_loss: 0.7735 - classification_loss: 0.1199 100/500 [=====>........................] - ETA: 2:12 - loss: 0.8901 - regression_loss: 0.7706 - classification_loss: 0.1195 101/500 [=====>........................] - ETA: 2:12 - loss: 0.8990 - regression_loss: 0.7782 - classification_loss: 0.1208 102/500 [=====>........................] - ETA: 2:11 - loss: 0.8970 - regression_loss: 0.7761 - classification_loss: 0.1209 103/500 [=====>........................] - ETA: 2:11 - loss: 0.9003 - regression_loss: 0.7789 - classification_loss: 0.1213 104/500 [=====>........................] - ETA: 2:10 - loss: 0.9031 - regression_loss: 0.7810 - classification_loss: 0.1221 105/500 [=====>........................] - ETA: 2:10 - loss: 0.9030 - regression_loss: 0.7809 - classification_loss: 0.1221 106/500 [=====>........................] - ETA: 2:10 - loss: 0.9046 - regression_loss: 0.7818 - classification_loss: 0.1227 107/500 [=====>........................] - ETA: 2:09 - loss: 0.9006 - regression_loss: 0.7784 - classification_loss: 0.1222 108/500 [=====>........................] - ETA: 2:09 - loss: 0.9032 - regression_loss: 0.7804 - classification_loss: 0.1228 109/500 [=====>........................] - ETA: 2:09 - loss: 0.9029 - regression_loss: 0.7804 - classification_loss: 0.1225 110/500 [=====>........................] 
- ETA: 2:08 - loss: 0.9020 - regression_loss: 0.7797 - classification_loss: 0.1223 111/500 [=====>........................] - ETA: 2:08 - loss: 0.8980 - regression_loss: 0.7765 - classification_loss: 0.1215 112/500 [=====>........................] - ETA: 2:08 - loss: 0.9017 - regression_loss: 0.7795 - classification_loss: 0.1222 113/500 [=====>........................] - ETA: 2:07 - loss: 0.8979 - regression_loss: 0.7762 - classification_loss: 0.1217 114/500 [=====>........................] - ETA: 2:07 - loss: 0.9017 - regression_loss: 0.7793 - classification_loss: 0.1224 115/500 [=====>........................] - ETA: 2:07 - loss: 0.8985 - regression_loss: 0.7766 - classification_loss: 0.1220 116/500 [=====>........................] - ETA: 2:06 - loss: 0.8954 - regression_loss: 0.7741 - classification_loss: 0.1213 117/500 [======>.......................] - ETA: 2:06 - loss: 0.8952 - regression_loss: 0.7744 - classification_loss: 0.1209 118/500 [======>.......................] - ETA: 2:06 - loss: 0.9022 - regression_loss: 0.7798 - classification_loss: 0.1224 119/500 [======>.......................] - ETA: 2:05 - loss: 0.9054 - regression_loss: 0.7823 - classification_loss: 0.1231 120/500 [======>.......................] - ETA: 2:05 - loss: 0.9046 - regression_loss: 0.7819 - classification_loss: 0.1226 121/500 [======>.......................] - ETA: 2:05 - loss: 0.9053 - regression_loss: 0.7824 - classification_loss: 0.1229 122/500 [======>.......................] - ETA: 2:04 - loss: 0.9039 - regression_loss: 0.7809 - classification_loss: 0.1229 123/500 [======>.......................] - ETA: 2:04 - loss: 0.9072 - regression_loss: 0.7832 - classification_loss: 0.1239 124/500 [======>.......................] - ETA: 2:04 - loss: 0.9047 - regression_loss: 0.7809 - classification_loss: 0.1238 125/500 [======>.......................] - ETA: 2:03 - loss: 0.9013 - regression_loss: 0.7771 - classification_loss: 0.1242 126/500 [======>.......................] 
- ETA: 2:03 - loss: 0.8984 - regression_loss: 0.7747 - classification_loss: 0.1236 127/500 [======>.......................] - ETA: 2:03 - loss: 0.9005 - regression_loss: 0.7764 - classification_loss: 0.1241 128/500 [======>.......................] - ETA: 2:02 - loss: 0.8968 - regression_loss: 0.7732 - classification_loss: 0.1235 129/500 [======>.......................] - ETA: 2:02 - loss: 0.9007 - regression_loss: 0.7764 - classification_loss: 0.1243 130/500 [======>.......................] - ETA: 2:02 - loss: 0.9003 - regression_loss: 0.7759 - classification_loss: 0.1245 131/500 [======>.......................] - ETA: 2:01 - loss: 0.8961 - regression_loss: 0.7722 - classification_loss: 0.1238 132/500 [======>.......................] - ETA: 2:01 - loss: 0.8938 - regression_loss: 0.7705 - classification_loss: 0.1234 133/500 [======>.......................] - ETA: 2:01 - loss: 0.8960 - regression_loss: 0.7722 - classification_loss: 0.1238 134/500 [=======>......................] - ETA: 2:00 - loss: 0.8924 - regression_loss: 0.7692 - classification_loss: 0.1232 135/500 [=======>......................] - ETA: 2:00 - loss: 0.8925 - regression_loss: 0.7694 - classification_loss: 0.1231 136/500 [=======>......................] - ETA: 2:00 - loss: 0.8934 - regression_loss: 0.7707 - classification_loss: 0.1227 137/500 [=======>......................] - ETA: 1:59 - loss: 0.8910 - regression_loss: 0.7687 - classification_loss: 0.1222 138/500 [=======>......................] - ETA: 1:59 - loss: 0.8910 - regression_loss: 0.7689 - classification_loss: 0.1221 139/500 [=======>......................] - ETA: 1:59 - loss: 0.8965 - regression_loss: 0.7732 - classification_loss: 0.1233 140/500 [=======>......................] - ETA: 1:58 - loss: 0.8942 - regression_loss: 0.7713 - classification_loss: 0.1229 141/500 [=======>......................] - ETA: 1:58 - loss: 0.8904 - regression_loss: 0.7681 - classification_loss: 0.1223 142/500 [=======>......................] 
- ETA: 1:58 - loss: 0.8918 - regression_loss: 0.7695 - classification_loss: 0.1223 143/500 [=======>......................] - ETA: 1:57 - loss: 0.8913 - regression_loss: 0.7691 - classification_loss: 0.1222 144/500 [=======>......................] - ETA: 1:57 - loss: 0.8958 - regression_loss: 0.7729 - classification_loss: 0.1229 145/500 [=======>......................] - ETA: 1:57 - loss: 0.8945 - regression_loss: 0.7720 - classification_loss: 0.1225 146/500 [=======>......................] - ETA: 1:56 - loss: 0.8932 - regression_loss: 0.7710 - classification_loss: 0.1222 147/500 [=======>......................] - ETA: 1:56 - loss: 0.8940 - regression_loss: 0.7719 - classification_loss: 0.1221 148/500 [=======>......................] - ETA: 1:56 - loss: 0.8970 - regression_loss: 0.7742 - classification_loss: 0.1227 149/500 [=======>......................] - ETA: 1:55 - loss: 0.8999 - regression_loss: 0.7766 - classification_loss: 0.1233 150/500 [========>.....................] - ETA: 1:55 - loss: 0.9026 - regression_loss: 0.7789 - classification_loss: 0.1237 151/500 [========>.....................] - ETA: 1:55 - loss: 0.9031 - regression_loss: 0.7795 - classification_loss: 0.1236 152/500 [========>.....................] - ETA: 1:54 - loss: 0.9054 - regression_loss: 0.7815 - classification_loss: 0.1239 153/500 [========>.....................] - ETA: 1:54 - loss: 0.9067 - regression_loss: 0.7826 - classification_loss: 0.1241 154/500 [========>.....................] - ETA: 1:54 - loss: 0.9036 - regression_loss: 0.7795 - classification_loss: 0.1242 155/500 [========>.....................] - ETA: 1:53 - loss: 0.9023 - regression_loss: 0.7783 - classification_loss: 0.1240 156/500 [========>.....................] - ETA: 1:53 - loss: 0.9047 - regression_loss: 0.7804 - classification_loss: 0.1243 157/500 [========>.....................] - ETA: 1:53 - loss: 0.9048 - regression_loss: 0.7805 - classification_loss: 0.1243 158/500 [========>.....................] 
- ETA: 1:52 - loss: 0.9076 - regression_loss: 0.7827 - classification_loss: 0.1248 159/500 [========>.....................] - ETA: 1:52 - loss: 0.9089 - regression_loss: 0.7839 - classification_loss: 0.1250 160/500 [========>.....................] - ETA: 1:52 - loss: 0.9061 - regression_loss: 0.7815 - classification_loss: 0.1246 161/500 [========>.....................] - ETA: 1:51 - loss: 0.9018 - regression_loss: 0.7778 - classification_loss: 0.1240 162/500 [========>.....................] - ETA: 1:51 - loss: 0.8991 - regression_loss: 0.7753 - classification_loss: 0.1237 163/500 [========>.....................] - ETA: 1:51 - loss: 0.8977 - regression_loss: 0.7743 - classification_loss: 0.1234 164/500 [========>.....................] - ETA: 1:50 - loss: 0.9009 - regression_loss: 0.7769 - classification_loss: 0.1240 165/500 [========>.....................] - ETA: 1:50 - loss: 0.9035 - regression_loss: 0.7791 - classification_loss: 0.1244 166/500 [========>.....................] - ETA: 1:50 - loss: 0.9042 - regression_loss: 0.7799 - classification_loss: 0.1243 167/500 [=========>....................] - ETA: 1:49 - loss: 0.9040 - regression_loss: 0.7796 - classification_loss: 0.1243 168/500 [=========>....................] - ETA: 1:49 - loss: 0.9008 - regression_loss: 0.7768 - classification_loss: 0.1240 169/500 [=========>....................] - ETA: 1:49 - loss: 0.9004 - regression_loss: 0.7765 - classification_loss: 0.1239 170/500 [=========>....................] - ETA: 1:48 - loss: 0.9002 - regression_loss: 0.7762 - classification_loss: 0.1240 171/500 [=========>....................] - ETA: 1:48 - loss: 0.8982 - regression_loss: 0.7745 - classification_loss: 0.1237 172/500 [=========>....................] - ETA: 1:48 - loss: 0.8949 - regression_loss: 0.7716 - classification_loss: 0.1233 173/500 [=========>....................] - ETA: 1:47 - loss: 0.8961 - regression_loss: 0.7724 - classification_loss: 0.1237 174/500 [=========>....................] 
- ETA: 1:47 - loss: 0.8959 - regression_loss: 0.7722 - classification_loss: 0.1237 175/500 [=========>....................] - ETA: 1:47 - loss: 0.8948 - regression_loss: 0.7713 - classification_loss: 0.1235 176/500 [=========>....................] - ETA: 1:46 - loss: 0.8951 - regression_loss: 0.7716 - classification_loss: 0.1235 177/500 [=========>....................] - ETA: 1:46 - loss: 0.8979 - regression_loss: 0.7739 - classification_loss: 0.1241 178/500 [=========>....................] - ETA: 1:46 - loss: 0.8998 - regression_loss: 0.7755 - classification_loss: 0.1243 179/500 [=========>....................] - ETA: 1:45 - loss: 0.9022 - regression_loss: 0.7777 - classification_loss: 0.1245 180/500 [=========>....................] - ETA: 1:45 - loss: 0.9023 - regression_loss: 0.7779 - classification_loss: 0.1245 181/500 [=========>....................] - ETA: 1:45 - loss: 0.8994 - regression_loss: 0.7752 - classification_loss: 0.1242 182/500 [=========>....................] - ETA: 1:44 - loss: 0.9008 - regression_loss: 0.7761 - classification_loss: 0.1247 183/500 [=========>....................] - ETA: 1:44 - loss: 0.9030 - regression_loss: 0.7781 - classification_loss: 0.1249 184/500 [==========>...................] - ETA: 1:44 - loss: 0.9016 - regression_loss: 0.7769 - classification_loss: 0.1248 185/500 [==========>...................] - ETA: 1:43 - loss: 0.8994 - regression_loss: 0.7749 - classification_loss: 0.1245 186/500 [==========>...................] - ETA: 1:43 - loss: 0.9000 - regression_loss: 0.7757 - classification_loss: 0.1243 187/500 [==========>...................] - ETA: 1:43 - loss: 0.8995 - regression_loss: 0.7752 - classification_loss: 0.1243 188/500 [==========>...................] - ETA: 1:42 - loss: 0.9005 - regression_loss: 0.7759 - classification_loss: 0.1246 189/500 [==========>...................] - ETA: 1:42 - loss: 0.9016 - regression_loss: 0.7770 - classification_loss: 0.1246 190/500 [==========>...................] 
- ETA: 1:42 - loss: 0.9039 - regression_loss: 0.7789 - classification_loss: 0.1250 191/500 [==========>...................] - ETA: 1:41 - loss: 0.9044 - regression_loss: 0.7795 - classification_loss: 0.1249 192/500 [==========>...................] - ETA: 1:41 - loss: 0.9063 - regression_loss: 0.7812 - classification_loss: 0.1251 193/500 [==========>...................] - ETA: 1:41 - loss: 0.9062 - regression_loss: 0.7813 - classification_loss: 0.1249 194/500 [==========>...................] - ETA: 1:40 - loss: 0.9063 - regression_loss: 0.7815 - classification_loss: 0.1248 195/500 [==========>...................] - ETA: 1:40 - loss: 0.9051 - regression_loss: 0.7804 - classification_loss: 0.1247 196/500 [==========>...................] - ETA: 1:40 - loss: 0.9028 - regression_loss: 0.7784 - classification_loss: 0.1244 197/500 [==========>...................] - ETA: 1:39 - loss: 0.9044 - regression_loss: 0.7797 - classification_loss: 0.1247 198/500 [==========>...................] - ETA: 1:39 - loss: 0.9044 - regression_loss: 0.7796 - classification_loss: 0.1249 199/500 [==========>...................] - ETA: 1:39 - loss: 0.9025 - regression_loss: 0.7779 - classification_loss: 0.1246 200/500 [===========>..................] - ETA: 1:38 - loss: 0.9044 - regression_loss: 0.7793 - classification_loss: 0.1250 201/500 [===========>..................] - ETA: 1:38 - loss: 0.9047 - regression_loss: 0.7796 - classification_loss: 0.1251 202/500 [===========>..................] - ETA: 1:38 - loss: 0.9038 - regression_loss: 0.7789 - classification_loss: 0.1249 203/500 [===========>..................] - ETA: 1:37 - loss: 0.9018 - regression_loss: 0.7771 - classification_loss: 0.1247 204/500 [===========>..................] - ETA: 1:37 - loss: 0.9015 - regression_loss: 0.7769 - classification_loss: 0.1247 205/500 [===========>..................] - ETA: 1:37 - loss: 0.8995 - regression_loss: 0.7751 - classification_loss: 0.1244 206/500 [===========>..................] 
- ETA: 1:37 - loss: 0.8986 - regression_loss: 0.7744 - classification_loss: 0.1242 207/500 [===========>..................] - ETA: 1:36 - loss: 0.8968 - regression_loss: 0.7728 - classification_loss: 0.1240 208/500 [===========>..................] - ETA: 1:36 - loss: 0.8955 - regression_loss: 0.7718 - classification_loss: 0.1237 209/500 [===========>..................] - ETA: 1:35 - loss: 0.8964 - regression_loss: 0.7726 - classification_loss: 0.1237 210/500 [===========>..................] - ETA: 1:35 - loss: 0.8984 - regression_loss: 0.7744 - classification_loss: 0.1240 211/500 [===========>..................] - ETA: 1:35 - loss: 0.8965 - regression_loss: 0.7728 - classification_loss: 0.1237 212/500 [===========>..................] - ETA: 1:34 - loss: 0.8979 - regression_loss: 0.7740 - classification_loss: 0.1239 213/500 [===========>..................] - ETA: 1:34 - loss: 0.8955 - regression_loss: 0.7719 - classification_loss: 0.1235 214/500 [===========>..................] - ETA: 1:34 - loss: 0.8948 - regression_loss: 0.7714 - classification_loss: 0.1234 215/500 [===========>..................] - ETA: 1:34 - loss: 0.8941 - regression_loss: 0.7708 - classification_loss: 0.1233 216/500 [===========>..................] - ETA: 1:33 - loss: 0.8911 - regression_loss: 0.7682 - classification_loss: 0.1229 217/500 [============>.................] - ETA: 1:33 - loss: 0.8907 - regression_loss: 0.7677 - classification_loss: 0.1229 218/500 [============>.................] - ETA: 1:33 - loss: 0.8892 - regression_loss: 0.7664 - classification_loss: 0.1228 219/500 [============>.................] - ETA: 1:32 - loss: 0.8903 - regression_loss: 0.7674 - classification_loss: 0.1229 220/500 [============>.................] - ETA: 1:32 - loss: 0.8918 - regression_loss: 0.7688 - classification_loss: 0.1230 221/500 [============>.................] - ETA: 1:31 - loss: 0.8909 - regression_loss: 0.7679 - classification_loss: 0.1229 222/500 [============>.................] 
- ETA: 1:31 - loss: 0.8920 - regression_loss: 0.7690 - classification_loss: 0.1230 223/500 [============>.................] - ETA: 1:31 - loss: 0.8925 - regression_loss: 0.7694 - classification_loss: 0.1230 224/500 [============>.................] - ETA: 1:30 - loss: 0.8926 - regression_loss: 0.7696 - classification_loss: 0.1230 225/500 [============>.................] - ETA: 1:30 - loss: 0.8931 - regression_loss: 0.7700 - classification_loss: 0.1231 226/500 [============>.................] - ETA: 1:30 - loss: 0.8956 - regression_loss: 0.7723 - classification_loss: 0.1233 227/500 [============>.................] - ETA: 1:29 - loss: 0.8979 - regression_loss: 0.7740 - classification_loss: 0.1239 228/500 [============>.................] - ETA: 1:29 - loss: 0.8985 - regression_loss: 0.7747 - classification_loss: 0.1238 229/500 [============>.................] - ETA: 1:29 - loss: 0.8989 - regression_loss: 0.7752 - classification_loss: 0.1237 230/500 [============>.................] - ETA: 1:29 - loss: 0.8967 - regression_loss: 0.7734 - classification_loss: 0.1234 231/500 [============>.................] - ETA: 1:28 - loss: 0.8945 - regression_loss: 0.7715 - classification_loss: 0.1230 232/500 [============>.................] - ETA: 1:28 - loss: 0.8930 - regression_loss: 0.7702 - classification_loss: 0.1228 233/500 [============>.................] - ETA: 1:28 - loss: 0.8938 - regression_loss: 0.7709 - classification_loss: 0.1229 234/500 [=============>................] - ETA: 1:27 - loss: 0.8955 - regression_loss: 0.7724 - classification_loss: 0.1230 235/500 [=============>................] - ETA: 1:27 - loss: 0.8924 - regression_loss: 0.7698 - classification_loss: 0.1226 236/500 [=============>................] - ETA: 1:27 - loss: 0.8954 - regression_loss: 0.7723 - classification_loss: 0.1231 237/500 [=============>................] - ETA: 1:26 - loss: 0.8955 - regression_loss: 0.7726 - classification_loss: 0.1229 238/500 [=============>................] 
- ETA: 1:26 - loss: 0.8987 - regression_loss: 0.7753 - classification_loss: 0.1234 239/500 [=============>................] - ETA: 1:26 - loss: 0.9001 - regression_loss: 0.7765 - classification_loss: 0.1236 240/500 [=============>................] - ETA: 1:25 - loss: 0.9007 - regression_loss: 0.7771 - classification_loss: 0.1237 241/500 [=============>................] - ETA: 1:25 - loss: 0.8989 - regression_loss: 0.7755 - classification_loss: 0.1234 242/500 [=============>................] - ETA: 1:25 - loss: 0.9031 - regression_loss: 0.7790 - classification_loss: 0.1241 243/500 [=============>................] - ETA: 1:24 - loss: 0.9014 - regression_loss: 0.7775 - classification_loss: 0.1238 244/500 [=============>................] - ETA: 1:24 - loss: 0.9044 - regression_loss: 0.7801 - classification_loss: 0.1243 245/500 [=============>................] - ETA: 1:24 - loss: 0.9027 - regression_loss: 0.7786 - classification_loss: 0.1241 246/500 [=============>................] - ETA: 1:23 - loss: 0.9033 - regression_loss: 0.7790 - classification_loss: 0.1242 247/500 [=============>................] - ETA: 1:23 - loss: 0.9033 - regression_loss: 0.7792 - classification_loss: 0.1242 248/500 [=============>................] - ETA: 1:23 - loss: 0.9038 - regression_loss: 0.7794 - classification_loss: 0.1244 249/500 [=============>................] - ETA: 1:22 - loss: 0.9041 - regression_loss: 0.7796 - classification_loss: 0.1245 250/500 [==============>...............] - ETA: 1:22 - loss: 0.9061 - regression_loss: 0.7814 - classification_loss: 0.1248 251/500 [==============>...............] - ETA: 1:22 - loss: 0.9051 - regression_loss: 0.7806 - classification_loss: 0.1245 252/500 [==============>...............] - ETA: 1:21 - loss: 0.9058 - regression_loss: 0.7812 - classification_loss: 0.1246 253/500 [==============>...............] - ETA: 1:21 - loss: 0.9061 - regression_loss: 0.7814 - classification_loss: 0.1247 254/500 [==============>...............] 
[Epoch 34/150: per-step progress updates (steps 255-499 of 500) omitted; loss fluctuated around 0.90 throughout.]
500/500 [==============================] - 165s 330ms/step - loss: 0.8995 - regression_loss: 0.7768 - classification_loss: 0.1227
1172 instances of class plum with average precision: 0.6415
mAP: 0.6415
Epoch 00034: saving model to ./training/snapshots/resnet101_pascal_34.h5
Epoch 35/150
[Per-step progress updates (steps 1-89 of 500) omitted; loss around 0.84 at step 89.]
- ETA: 2:16 - loss: 0.8359 - regression_loss: 0.7217 - classification_loss: 0.1142 90/500 [====>.........................] - ETA: 2:15 - loss: 0.8361 - regression_loss: 0.7220 - classification_loss: 0.1141 91/500 [====>.........................] - ETA: 2:15 - loss: 0.8443 - regression_loss: 0.7289 - classification_loss: 0.1154 92/500 [====>.........................] - ETA: 2:15 - loss: 0.8481 - regression_loss: 0.7330 - classification_loss: 0.1152 93/500 [====>.........................] - ETA: 2:14 - loss: 0.8511 - regression_loss: 0.7351 - classification_loss: 0.1161 94/500 [====>.........................] - ETA: 2:14 - loss: 0.8536 - regression_loss: 0.7375 - classification_loss: 0.1161 95/500 [====>.........................] - ETA: 2:14 - loss: 0.8550 - regression_loss: 0.7387 - classification_loss: 0.1163 96/500 [====>.........................] - ETA: 2:13 - loss: 0.8561 - regression_loss: 0.7393 - classification_loss: 0.1168 97/500 [====>.........................] - ETA: 2:13 - loss: 0.8593 - regression_loss: 0.7422 - classification_loss: 0.1171 98/500 [====>.........................] - ETA: 2:12 - loss: 0.8539 - regression_loss: 0.7374 - classification_loss: 0.1165 99/500 [====>.........................] - ETA: 2:12 - loss: 0.8524 - regression_loss: 0.7365 - classification_loss: 0.1160 100/500 [=====>........................] - ETA: 2:12 - loss: 0.8482 - regression_loss: 0.7328 - classification_loss: 0.1154 101/500 [=====>........................] - ETA: 2:12 - loss: 0.8462 - regression_loss: 0.7312 - classification_loss: 0.1151 102/500 [=====>........................] - ETA: 2:11 - loss: 0.8464 - regression_loss: 0.7317 - classification_loss: 0.1147 103/500 [=====>........................] - ETA: 2:11 - loss: 0.8452 - regression_loss: 0.7306 - classification_loss: 0.1146 104/500 [=====>........................] - ETA: 2:11 - loss: 0.8439 - regression_loss: 0.7296 - classification_loss: 0.1144 105/500 [=====>........................] 
- ETA: 2:10 - loss: 0.8395 - regression_loss: 0.7259 - classification_loss: 0.1137 106/500 [=====>........................] - ETA: 2:10 - loss: 0.8375 - regression_loss: 0.7241 - classification_loss: 0.1135 107/500 [=====>........................] - ETA: 2:10 - loss: 0.8348 - regression_loss: 0.7218 - classification_loss: 0.1129 108/500 [=====>........................] - ETA: 2:09 - loss: 0.8324 - regression_loss: 0.7198 - classification_loss: 0.1127 109/500 [=====>........................] - ETA: 2:09 - loss: 0.8306 - regression_loss: 0.7182 - classification_loss: 0.1124 110/500 [=====>........................] - ETA: 2:09 - loss: 0.8293 - regression_loss: 0.7168 - classification_loss: 0.1125 111/500 [=====>........................] - ETA: 2:08 - loss: 0.8260 - regression_loss: 0.7142 - classification_loss: 0.1118 112/500 [=====>........................] - ETA: 2:08 - loss: 0.8263 - regression_loss: 0.7145 - classification_loss: 0.1118 113/500 [=====>........................] - ETA: 2:07 - loss: 0.8229 - regression_loss: 0.7118 - classification_loss: 0.1112 114/500 [=====>........................] - ETA: 2:07 - loss: 0.8203 - regression_loss: 0.7094 - classification_loss: 0.1108 115/500 [=====>........................] - ETA: 2:07 - loss: 0.8208 - regression_loss: 0.7099 - classification_loss: 0.1109 116/500 [=====>........................] - ETA: 2:07 - loss: 0.8233 - regression_loss: 0.7120 - classification_loss: 0.1113 117/500 [======>.......................] - ETA: 2:06 - loss: 0.8261 - regression_loss: 0.7144 - classification_loss: 0.1117 118/500 [======>.......................] - ETA: 2:06 - loss: 0.8262 - regression_loss: 0.7142 - classification_loss: 0.1120 119/500 [======>.......................] - ETA: 2:06 - loss: 0.8276 - regression_loss: 0.7152 - classification_loss: 0.1124 120/500 [======>.......................] - ETA: 2:05 - loss: 0.8299 - regression_loss: 0.7168 - classification_loss: 0.1132 121/500 [======>.......................] 
- ETA: 2:05 - loss: 0.8336 - regression_loss: 0.7198 - classification_loss: 0.1138 122/500 [======>.......................] - ETA: 2:05 - loss: 0.8352 - regression_loss: 0.7214 - classification_loss: 0.1138 123/500 [======>.......................] - ETA: 2:04 - loss: 0.8306 - regression_loss: 0.7173 - classification_loss: 0.1133 124/500 [======>.......................] - ETA: 2:04 - loss: 0.8293 - regression_loss: 0.7163 - classification_loss: 0.1129 125/500 [======>.......................] - ETA: 2:04 - loss: 0.8272 - regression_loss: 0.7146 - classification_loss: 0.1126 126/500 [======>.......................] - ETA: 2:03 - loss: 0.8234 - regression_loss: 0.7113 - classification_loss: 0.1121 127/500 [======>.......................] - ETA: 2:03 - loss: 0.8284 - regression_loss: 0.7156 - classification_loss: 0.1128 128/500 [======>.......................] - ETA: 2:03 - loss: 0.8302 - regression_loss: 0.7172 - classification_loss: 0.1130 129/500 [======>.......................] - ETA: 2:02 - loss: 0.8287 - regression_loss: 0.7158 - classification_loss: 0.1129 130/500 [======>.......................] - ETA: 2:02 - loss: 0.8324 - regression_loss: 0.7189 - classification_loss: 0.1135 131/500 [======>.......................] - ETA: 2:02 - loss: 0.8307 - regression_loss: 0.7177 - classification_loss: 0.1130 132/500 [======>.......................] - ETA: 2:01 - loss: 0.8300 - regression_loss: 0.7172 - classification_loss: 0.1128 133/500 [======>.......................] - ETA: 2:01 - loss: 0.8269 - regression_loss: 0.7144 - classification_loss: 0.1125 134/500 [=======>......................] - ETA: 2:01 - loss: 0.8336 - regression_loss: 0.7201 - classification_loss: 0.1135 135/500 [=======>......................] - ETA: 2:00 - loss: 0.8350 - regression_loss: 0.7214 - classification_loss: 0.1137 136/500 [=======>......................] - ETA: 2:00 - loss: 0.8315 - regression_loss: 0.7181 - classification_loss: 0.1134 137/500 [=======>......................] 
- ETA: 2:00 - loss: 0.8316 - regression_loss: 0.7183 - classification_loss: 0.1133 138/500 [=======>......................] - ETA: 2:00 - loss: 0.8279 - regression_loss: 0.7152 - classification_loss: 0.1128 139/500 [=======>......................] - ETA: 1:59 - loss: 0.8277 - regression_loss: 0.7151 - classification_loss: 0.1126 140/500 [=======>......................] - ETA: 1:59 - loss: 0.8313 - regression_loss: 0.7181 - classification_loss: 0.1132 141/500 [=======>......................] - ETA: 1:59 - loss: 0.8276 - regression_loss: 0.7149 - classification_loss: 0.1126 142/500 [=======>......................] - ETA: 1:58 - loss: 0.8280 - regression_loss: 0.7154 - classification_loss: 0.1125 143/500 [=======>......................] - ETA: 1:58 - loss: 0.8266 - regression_loss: 0.7142 - classification_loss: 0.1124 144/500 [=======>......................] - ETA: 1:57 - loss: 0.8296 - regression_loss: 0.7165 - classification_loss: 0.1131 145/500 [=======>......................] - ETA: 1:57 - loss: 0.8286 - regression_loss: 0.7156 - classification_loss: 0.1129 146/500 [=======>......................] - ETA: 1:57 - loss: 0.8308 - regression_loss: 0.7175 - classification_loss: 0.1132 147/500 [=======>......................] - ETA: 1:56 - loss: 0.8276 - regression_loss: 0.7148 - classification_loss: 0.1128 148/500 [=======>......................] - ETA: 1:56 - loss: 0.8327 - regression_loss: 0.7190 - classification_loss: 0.1137 149/500 [=======>......................] - ETA: 1:56 - loss: 0.8349 - regression_loss: 0.7211 - classification_loss: 0.1138 150/500 [========>.....................] - ETA: 1:55 - loss: 0.8364 - regression_loss: 0.7225 - classification_loss: 0.1139 151/500 [========>.....................] - ETA: 1:55 - loss: 0.8348 - regression_loss: 0.7212 - classification_loss: 0.1136 152/500 [========>.....................] - ETA: 1:55 - loss: 0.8369 - regression_loss: 0.7230 - classification_loss: 0.1139 153/500 [========>.....................] 
- ETA: 1:54 - loss: 0.8392 - regression_loss: 0.7249 - classification_loss: 0.1142 154/500 [========>.....................] - ETA: 1:54 - loss: 0.8363 - regression_loss: 0.7226 - classification_loss: 0.1137 155/500 [========>.....................] - ETA: 1:54 - loss: 0.8391 - regression_loss: 0.7248 - classification_loss: 0.1143 156/500 [========>.....................] - ETA: 1:53 - loss: 0.8413 - regression_loss: 0.7268 - classification_loss: 0.1145 157/500 [========>.....................] - ETA: 1:53 - loss: 0.8437 - regression_loss: 0.7287 - classification_loss: 0.1151 158/500 [========>.....................] - ETA: 1:53 - loss: 0.8463 - regression_loss: 0.7307 - classification_loss: 0.1156 159/500 [========>.....................] - ETA: 1:52 - loss: 0.8434 - regression_loss: 0.7280 - classification_loss: 0.1154 160/500 [========>.....................] - ETA: 1:52 - loss: 0.8449 - regression_loss: 0.7297 - classification_loss: 0.1152 161/500 [========>.....................] - ETA: 1:52 - loss: 0.8480 - regression_loss: 0.7322 - classification_loss: 0.1157 162/500 [========>.....................] - ETA: 1:51 - loss: 0.8458 - regression_loss: 0.7304 - classification_loss: 0.1154 163/500 [========>.....................] - ETA: 1:51 - loss: 0.8457 - regression_loss: 0.7302 - classification_loss: 0.1154 164/500 [========>.....................] - ETA: 1:51 - loss: 0.8469 - regression_loss: 0.7312 - classification_loss: 0.1158 165/500 [========>.....................] - ETA: 1:50 - loss: 0.8448 - regression_loss: 0.7294 - classification_loss: 0.1153 166/500 [========>.....................] - ETA: 1:50 - loss: 0.8435 - regression_loss: 0.7283 - classification_loss: 0.1152 167/500 [=========>....................] - ETA: 1:50 - loss: 0.8416 - regression_loss: 0.7268 - classification_loss: 0.1148 168/500 [=========>....................] - ETA: 1:50 - loss: 0.8445 - regression_loss: 0.7292 - classification_loss: 0.1154 169/500 [=========>....................] 
- ETA: 1:49 - loss: 0.8445 - regression_loss: 0.7290 - classification_loss: 0.1155 170/500 [=========>....................] - ETA: 1:49 - loss: 0.8495 - regression_loss: 0.7329 - classification_loss: 0.1167 171/500 [=========>....................] - ETA: 1:49 - loss: 0.8497 - regression_loss: 0.7331 - classification_loss: 0.1166 172/500 [=========>....................] - ETA: 1:48 - loss: 0.8476 - regression_loss: 0.7313 - classification_loss: 0.1164 173/500 [=========>....................] - ETA: 1:48 - loss: 0.8475 - regression_loss: 0.7310 - classification_loss: 0.1165 174/500 [=========>....................] - ETA: 1:48 - loss: 0.8481 - regression_loss: 0.7314 - classification_loss: 0.1166 175/500 [=========>....................] - ETA: 1:47 - loss: 0.8476 - regression_loss: 0.7311 - classification_loss: 0.1165 176/500 [=========>....................] - ETA: 1:47 - loss: 0.8458 - regression_loss: 0.7296 - classification_loss: 0.1162 177/500 [=========>....................] - ETA: 1:47 - loss: 0.8477 - regression_loss: 0.7311 - classification_loss: 0.1165 178/500 [=========>....................] - ETA: 1:46 - loss: 0.8458 - regression_loss: 0.7296 - classification_loss: 0.1162 179/500 [=========>....................] - ETA: 1:46 - loss: 0.8457 - regression_loss: 0.7296 - classification_loss: 0.1161 180/500 [=========>....................] - ETA: 1:46 - loss: 0.8447 - regression_loss: 0.7288 - classification_loss: 0.1159 181/500 [=========>....................] - ETA: 1:45 - loss: 0.8416 - regression_loss: 0.7261 - classification_loss: 0.1155 182/500 [=========>....................] - ETA: 1:45 - loss: 0.8429 - regression_loss: 0.7272 - classification_loss: 0.1157 183/500 [=========>....................] - ETA: 1:45 - loss: 0.8458 - regression_loss: 0.7295 - classification_loss: 0.1162 184/500 [==========>...................] - ETA: 1:44 - loss: 0.8458 - regression_loss: 0.7296 - classification_loss: 0.1162 185/500 [==========>...................] 
- ETA: 1:44 - loss: 0.8448 - regression_loss: 0.7288 - classification_loss: 0.1161 186/500 [==========>...................] - ETA: 1:44 - loss: 0.8447 - regression_loss: 0.7286 - classification_loss: 0.1161 187/500 [==========>...................] - ETA: 1:43 - loss: 0.8437 - regression_loss: 0.7278 - classification_loss: 0.1158 188/500 [==========>...................] - ETA: 1:43 - loss: 0.8449 - regression_loss: 0.7289 - classification_loss: 0.1160 189/500 [==========>...................] - ETA: 1:43 - loss: 0.8460 - regression_loss: 0.7299 - classification_loss: 0.1161 190/500 [==========>...................] - ETA: 1:42 - loss: 0.8454 - regression_loss: 0.7294 - classification_loss: 0.1161 191/500 [==========>...................] - ETA: 1:42 - loss: 0.8455 - regression_loss: 0.7295 - classification_loss: 0.1160 192/500 [==========>...................] - ETA: 1:42 - loss: 0.8455 - regression_loss: 0.7296 - classification_loss: 0.1159 193/500 [==========>...................] - ETA: 1:41 - loss: 0.8424 - regression_loss: 0.7269 - classification_loss: 0.1155 194/500 [==========>...................] - ETA: 1:41 - loss: 0.8425 - regression_loss: 0.7267 - classification_loss: 0.1157 195/500 [==========>...................] - ETA: 1:41 - loss: 0.8431 - regression_loss: 0.7272 - classification_loss: 0.1159 196/500 [==========>...................] - ETA: 1:40 - loss: 0.8446 - regression_loss: 0.7284 - classification_loss: 0.1162 197/500 [==========>...................] - ETA: 1:40 - loss: 0.8435 - regression_loss: 0.7275 - classification_loss: 0.1160 198/500 [==========>...................] - ETA: 1:40 - loss: 0.8464 - regression_loss: 0.7299 - classification_loss: 0.1166 199/500 [==========>...................] - ETA: 1:39 - loss: 0.8477 - regression_loss: 0.7311 - classification_loss: 0.1166 200/500 [===========>..................] - ETA: 1:39 - loss: 0.8483 - regression_loss: 0.7316 - classification_loss: 0.1167 201/500 [===========>..................] 
- ETA: 1:39 - loss: 0.8494 - regression_loss: 0.7324 - classification_loss: 0.1170 202/500 [===========>..................] - ETA: 1:38 - loss: 0.8475 - regression_loss: 0.7309 - classification_loss: 0.1166 203/500 [===========>..................] - ETA: 1:38 - loss: 0.8450 - regression_loss: 0.7283 - classification_loss: 0.1166 204/500 [===========>..................] - ETA: 1:38 - loss: 0.8457 - regression_loss: 0.7289 - classification_loss: 0.1168 205/500 [===========>..................] - ETA: 1:37 - loss: 0.8475 - regression_loss: 0.7305 - classification_loss: 0.1170 206/500 [===========>..................] - ETA: 1:37 - loss: 0.8467 - regression_loss: 0.7298 - classification_loss: 0.1168 207/500 [===========>..................] - ETA: 1:37 - loss: 0.8466 - regression_loss: 0.7297 - classification_loss: 0.1169 208/500 [===========>..................] - ETA: 1:36 - loss: 0.8478 - regression_loss: 0.7307 - classification_loss: 0.1170 209/500 [===========>..................] - ETA: 1:36 - loss: 0.8508 - regression_loss: 0.7333 - classification_loss: 0.1175 210/500 [===========>..................] - ETA: 1:36 - loss: 0.8532 - regression_loss: 0.7352 - classification_loss: 0.1181 211/500 [===========>..................] - ETA: 1:35 - loss: 0.8529 - regression_loss: 0.7348 - classification_loss: 0.1181 212/500 [===========>..................] - ETA: 1:35 - loss: 0.8543 - regression_loss: 0.7361 - classification_loss: 0.1182 213/500 [===========>..................] - ETA: 1:35 - loss: 0.8550 - regression_loss: 0.7371 - classification_loss: 0.1179 214/500 [===========>..................] - ETA: 1:34 - loss: 0.8542 - regression_loss: 0.7364 - classification_loss: 0.1178 215/500 [===========>..................] - ETA: 1:34 - loss: 0.8521 - regression_loss: 0.7345 - classification_loss: 0.1176 216/500 [===========>..................] - ETA: 1:34 - loss: 0.8505 - regression_loss: 0.7332 - classification_loss: 0.1174 217/500 [============>.................] 
- ETA: 1:33 - loss: 0.8503 - regression_loss: 0.7328 - classification_loss: 0.1174 218/500 [============>.................] - ETA: 1:33 - loss: 0.8489 - regression_loss: 0.7317 - classification_loss: 0.1172 219/500 [============>.................] - ETA: 1:33 - loss: 0.8469 - regression_loss: 0.7300 - classification_loss: 0.1169 220/500 [============>.................] - ETA: 1:32 - loss: 0.8481 - regression_loss: 0.7312 - classification_loss: 0.1170 221/500 [============>.................] - ETA: 1:32 - loss: 0.8467 - regression_loss: 0.7300 - classification_loss: 0.1168 222/500 [============>.................] - ETA: 1:32 - loss: 0.8461 - regression_loss: 0.7294 - classification_loss: 0.1167 223/500 [============>.................] - ETA: 1:31 - loss: 0.8463 - regression_loss: 0.7295 - classification_loss: 0.1168 224/500 [============>.................] - ETA: 1:31 - loss: 0.8459 - regression_loss: 0.7292 - classification_loss: 0.1168 225/500 [============>.................] - ETA: 1:31 - loss: 0.8456 - regression_loss: 0.7290 - classification_loss: 0.1166 226/500 [============>.................] - ETA: 1:30 - loss: 0.8458 - regression_loss: 0.7291 - classification_loss: 0.1168 227/500 [============>.................] - ETA: 1:30 - loss: 0.8441 - regression_loss: 0.7275 - classification_loss: 0.1166 228/500 [============>.................] - ETA: 1:30 - loss: 0.8436 - regression_loss: 0.7270 - classification_loss: 0.1165 229/500 [============>.................] - ETA: 1:29 - loss: 0.8457 - regression_loss: 0.7288 - classification_loss: 0.1170 230/500 [============>.................] - ETA: 1:29 - loss: 0.8446 - regression_loss: 0.7279 - classification_loss: 0.1167 231/500 [============>.................] - ETA: 1:29 - loss: 0.8431 - regression_loss: 0.7268 - classification_loss: 0.1164 232/500 [============>.................] - ETA: 1:28 - loss: 0.8451 - regression_loss: 0.7285 - classification_loss: 0.1166 233/500 [============>.................] 
- ETA: 1:28 - loss: 0.8456 - regression_loss: 0.7289 - classification_loss: 0.1167 234/500 [=============>................] - ETA: 1:28 - loss: 0.8450 - regression_loss: 0.7284 - classification_loss: 0.1166 235/500 [=============>................] - ETA: 1:27 - loss: 0.8462 - regression_loss: 0.7295 - classification_loss: 0.1166 236/500 [=============>................] - ETA: 1:27 - loss: 0.8460 - regression_loss: 0.7295 - classification_loss: 0.1165 237/500 [=============>................] - ETA: 1:27 - loss: 0.8444 - regression_loss: 0.7281 - classification_loss: 0.1163 238/500 [=============>................] - ETA: 1:26 - loss: 0.8449 - regression_loss: 0.7284 - classification_loss: 0.1165 239/500 [=============>................] - ETA: 1:26 - loss: 0.8440 - regression_loss: 0.7278 - classification_loss: 0.1162 240/500 [=============>................] - ETA: 1:26 - loss: 0.8454 - regression_loss: 0.7290 - classification_loss: 0.1164 241/500 [=============>................] - ETA: 1:25 - loss: 0.8438 - regression_loss: 0.7277 - classification_loss: 0.1161 242/500 [=============>................] - ETA: 1:25 - loss: 0.8465 - regression_loss: 0.7300 - classification_loss: 0.1166 243/500 [=============>................] - ETA: 1:25 - loss: 0.8484 - regression_loss: 0.7316 - classification_loss: 0.1168 244/500 [=============>................] - ETA: 1:24 - loss: 0.8489 - regression_loss: 0.7321 - classification_loss: 0.1168 245/500 [=============>................] - ETA: 1:24 - loss: 0.8473 - regression_loss: 0.7308 - classification_loss: 0.1165 246/500 [=============>................] - ETA: 1:24 - loss: 0.8504 - regression_loss: 0.7332 - classification_loss: 0.1172 247/500 [=============>................] - ETA: 1:23 - loss: 0.8501 - regression_loss: 0.7331 - classification_loss: 0.1170 248/500 [=============>................] - ETA: 1:23 - loss: 0.8521 - regression_loss: 0.7346 - classification_loss: 0.1174 249/500 [=============>................] 
- ETA: 1:23 - loss: 0.8538 - regression_loss: 0.7359 - classification_loss: 0.1178 250/500 [==============>...............] - ETA: 1:22 - loss: 0.8530 - regression_loss: 0.7354 - classification_loss: 0.1177 251/500 [==============>...............] - ETA: 1:22 - loss: 0.8525 - regression_loss: 0.7350 - classification_loss: 0.1175 252/500 [==============>...............] - ETA: 1:22 - loss: 0.8515 - regression_loss: 0.7342 - classification_loss: 0.1174 253/500 [==============>...............] - ETA: 1:21 - loss: 0.8517 - regression_loss: 0.7342 - classification_loss: 0.1175 254/500 [==============>...............] - ETA: 1:21 - loss: 0.8523 - regression_loss: 0.7347 - classification_loss: 0.1177 255/500 [==============>...............] - ETA: 1:21 - loss: 0.8513 - regression_loss: 0.7338 - classification_loss: 0.1174 256/500 [==============>...............] - ETA: 1:20 - loss: 0.8487 - regression_loss: 0.7316 - classification_loss: 0.1171 257/500 [==============>...............] - ETA: 1:20 - loss: 0.8491 - regression_loss: 0.7320 - classification_loss: 0.1170 258/500 [==============>...............] - ETA: 1:20 - loss: 0.8507 - regression_loss: 0.7335 - classification_loss: 0.1172 259/500 [==============>...............] - ETA: 1:19 - loss: 0.8508 - regression_loss: 0.7337 - classification_loss: 0.1171 260/500 [==============>...............] - ETA: 1:19 - loss: 0.8488 - regression_loss: 0.7320 - classification_loss: 0.1168 261/500 [==============>...............] - ETA: 1:19 - loss: 0.8497 - regression_loss: 0.7328 - classification_loss: 0.1169 262/500 [==============>...............] - ETA: 1:18 - loss: 0.8497 - regression_loss: 0.7328 - classification_loss: 0.1169 263/500 [==============>...............] - ETA: 1:18 - loss: 0.8476 - regression_loss: 0.7310 - classification_loss: 0.1166 264/500 [==============>...............] - ETA: 1:18 - loss: 0.8485 - regression_loss: 0.7317 - classification_loss: 0.1169 265/500 [==============>...............] 
- ETA: 1:17 - loss: 0.8489 - regression_loss: 0.7319 - classification_loss: 0.1170 266/500 [==============>...............] - ETA: 1:17 - loss: 0.8476 - regression_loss: 0.7308 - classification_loss: 0.1168 267/500 [===============>..............] - ETA: 1:17 - loss: 0.8488 - regression_loss: 0.7319 - classification_loss: 0.1169 268/500 [===============>..............] - ETA: 1:16 - loss: 0.8489 - regression_loss: 0.7320 - classification_loss: 0.1169 269/500 [===============>..............] - ETA: 1:16 - loss: 0.8493 - regression_loss: 0.7323 - classification_loss: 0.1170 270/500 [===============>..............] - ETA: 1:16 - loss: 0.8484 - regression_loss: 0.7316 - classification_loss: 0.1168 271/500 [===============>..............] - ETA: 1:15 - loss: 0.8483 - regression_loss: 0.7315 - classification_loss: 0.1167 272/500 [===============>..............] - ETA: 1:15 - loss: 0.8481 - regression_loss: 0.7314 - classification_loss: 0.1167 273/500 [===============>..............] - ETA: 1:15 - loss: 0.8487 - regression_loss: 0.7319 - classification_loss: 0.1168 274/500 [===============>..............] - ETA: 1:14 - loss: 0.8491 - regression_loss: 0.7323 - classification_loss: 0.1168 275/500 [===============>..............] - ETA: 1:14 - loss: 0.8490 - regression_loss: 0.7323 - classification_loss: 0.1167 276/500 [===============>..............] - ETA: 1:14 - loss: 0.8514 - regression_loss: 0.7344 - classification_loss: 0.1170 277/500 [===============>..............] - ETA: 1:13 - loss: 0.8510 - regression_loss: 0.7340 - classification_loss: 0.1170 278/500 [===============>..............] - ETA: 1:13 - loss: 0.8507 - regression_loss: 0.7337 - classification_loss: 0.1170 279/500 [===============>..............] - ETA: 1:13 - loss: 0.8514 - regression_loss: 0.7343 - classification_loss: 0.1171 280/500 [===============>..............] - ETA: 1:12 - loss: 0.8514 - regression_loss: 0.7343 - classification_loss: 0.1171 281/500 [===============>..............] 
- ETA: 1:12 - loss: 0.8528 - regression_loss: 0.7357 - classification_loss: 0.1171 282/500 [===============>..............] - ETA: 1:12 - loss: 0.8514 - regression_loss: 0.7344 - classification_loss: 0.1170 283/500 [===============>..............] - ETA: 1:11 - loss: 0.8499 - regression_loss: 0.7332 - classification_loss: 0.1168 284/500 [================>.............] - ETA: 1:11 - loss: 0.8494 - regression_loss: 0.7328 - classification_loss: 0.1166 285/500 [================>.............] - ETA: 1:11 - loss: 0.8487 - regression_loss: 0.7322 - classification_loss: 0.1166 286/500 [================>.............] - ETA: 1:10 - loss: 0.8488 - regression_loss: 0.7321 - classification_loss: 0.1166 287/500 [================>.............] - ETA: 1:10 - loss: 0.8508 - regression_loss: 0.7338 - classification_loss: 0.1170 288/500 [================>.............] - ETA: 1:10 - loss: 0.8513 - regression_loss: 0.7344 - classification_loss: 0.1170 289/500 [================>.............] - ETA: 1:09 - loss: 0.8504 - regression_loss: 0.7337 - classification_loss: 0.1167 290/500 [================>.............] - ETA: 1:09 - loss: 0.8489 - regression_loss: 0.7324 - classification_loss: 0.1165 291/500 [================>.............] - ETA: 1:09 - loss: 0.8471 - regression_loss: 0.7309 - classification_loss: 0.1162 292/500 [================>.............] - ETA: 1:08 - loss: 0.8472 - regression_loss: 0.7311 - classification_loss: 0.1161 293/500 [================>.............] - ETA: 1:08 - loss: 0.8462 - regression_loss: 0.7304 - classification_loss: 0.1158 294/500 [================>.............] - ETA: 1:08 - loss: 0.8483 - regression_loss: 0.7322 - classification_loss: 0.1161 295/500 [================>.............] - ETA: 1:07 - loss: 0.8489 - regression_loss: 0.7326 - classification_loss: 0.1163 296/500 [================>.............] - ETA: 1:07 - loss: 0.8492 - regression_loss: 0.7327 - classification_loss: 0.1165 297/500 [================>.............] 
[per-step progress output for epoch 35, steps 298-499, elided; running loss stayed in the 0.85-0.86 range]
500/500 [==============================] - 166s 331ms/step - loss: 0.8522 - regression_loss: 0.7358 - classification_loss: 0.1165
1172 instances of class plum with average precision: 0.6218
mAP: 0.6218
Epoch 00035: saving model to ./training/snapshots/resnet101_pascal_35.h5
Epoch 36/150
[per-step progress output for epoch 36, steps 1-131, elided; running loss near 0.89 at step 131/500, log truncated mid-epoch]
- ETA: 2:01 - loss: 0.8941 - regression_loss: 0.7695 - classification_loss: 0.1246 133/500 [======>.......................] - ETA: 2:01 - loss: 0.8899 - regression_loss: 0.7656 - classification_loss: 0.1243 134/500 [=======>......................] - ETA: 2:01 - loss: 0.8901 - regression_loss: 0.7657 - classification_loss: 0.1243 135/500 [=======>......................] - ETA: 2:00 - loss: 0.8919 - regression_loss: 0.7673 - classification_loss: 0.1246 136/500 [=======>......................] - ETA: 2:00 - loss: 0.8926 - regression_loss: 0.7679 - classification_loss: 0.1247 137/500 [=======>......................] - ETA: 2:00 - loss: 0.8890 - regression_loss: 0.7648 - classification_loss: 0.1242 138/500 [=======>......................] - ETA: 1:59 - loss: 0.8844 - regression_loss: 0.7607 - classification_loss: 0.1236 139/500 [=======>......................] - ETA: 1:59 - loss: 0.8875 - regression_loss: 0.7634 - classification_loss: 0.1242 140/500 [=======>......................] - ETA: 1:59 - loss: 0.8898 - regression_loss: 0.7654 - classification_loss: 0.1243 141/500 [=======>......................] - ETA: 1:58 - loss: 0.8942 - regression_loss: 0.7692 - classification_loss: 0.1250 142/500 [=======>......................] - ETA: 1:58 - loss: 0.8941 - regression_loss: 0.7694 - classification_loss: 0.1247 143/500 [=======>......................] - ETA: 1:58 - loss: 0.8902 - regression_loss: 0.7661 - classification_loss: 0.1241 144/500 [=======>......................] - ETA: 1:57 - loss: 0.8886 - regression_loss: 0.7647 - classification_loss: 0.1238 145/500 [=======>......................] - ETA: 1:57 - loss: 0.8963 - regression_loss: 0.7712 - classification_loss: 0.1251 146/500 [=======>......................] - ETA: 1:57 - loss: 0.8994 - regression_loss: 0.7742 - classification_loss: 0.1252 147/500 [=======>......................] - ETA: 1:56 - loss: 0.8953 - regression_loss: 0.7706 - classification_loss: 0.1247 148/500 [=======>......................] 
- ETA: 1:56 - loss: 0.8995 - regression_loss: 0.7745 - classification_loss: 0.1250 149/500 [=======>......................] - ETA: 1:56 - loss: 0.9048 - regression_loss: 0.7791 - classification_loss: 0.1257 150/500 [========>.....................] - ETA: 1:55 - loss: 0.9062 - regression_loss: 0.7806 - classification_loss: 0.1256 151/500 [========>.....................] - ETA: 1:55 - loss: 0.9102 - regression_loss: 0.7842 - classification_loss: 0.1260 152/500 [========>.....................] - ETA: 1:55 - loss: 0.9085 - regression_loss: 0.7828 - classification_loss: 0.1257 153/500 [========>.....................] - ETA: 1:54 - loss: 0.9088 - regression_loss: 0.7830 - classification_loss: 0.1257 154/500 [========>.....................] - ETA: 1:54 - loss: 0.9062 - regression_loss: 0.7809 - classification_loss: 0.1253 155/500 [========>.....................] - ETA: 1:54 - loss: 0.9048 - regression_loss: 0.7799 - classification_loss: 0.1249 156/500 [========>.....................] - ETA: 1:53 - loss: 0.9069 - regression_loss: 0.7818 - classification_loss: 0.1250 157/500 [========>.....................] - ETA: 1:53 - loss: 0.9066 - regression_loss: 0.7817 - classification_loss: 0.1248 158/500 [========>.....................] - ETA: 1:53 - loss: 0.9060 - regression_loss: 0.7813 - classification_loss: 0.1247 159/500 [========>.....................] - ETA: 1:52 - loss: 0.9064 - regression_loss: 0.7818 - classification_loss: 0.1246 160/500 [========>.....................] - ETA: 1:52 - loss: 0.9030 - regression_loss: 0.7787 - classification_loss: 0.1243 161/500 [========>.....................] - ETA: 1:52 - loss: 0.9003 - regression_loss: 0.7762 - classification_loss: 0.1241 162/500 [========>.....................] - ETA: 1:51 - loss: 0.8971 - regression_loss: 0.7734 - classification_loss: 0.1236 163/500 [========>.....................] - ETA: 1:51 - loss: 0.8985 - regression_loss: 0.7747 - classification_loss: 0.1238 164/500 [========>.....................] 
- ETA: 1:51 - loss: 0.8961 - regression_loss: 0.7725 - classification_loss: 0.1236 165/500 [========>.....................] - ETA: 1:50 - loss: 0.8948 - regression_loss: 0.7713 - classification_loss: 0.1235 166/500 [========>.....................] - ETA: 1:50 - loss: 0.8939 - regression_loss: 0.7705 - classification_loss: 0.1234 167/500 [=========>....................] - ETA: 1:50 - loss: 0.8922 - regression_loss: 0.7691 - classification_loss: 0.1231 168/500 [=========>....................] - ETA: 1:49 - loss: 0.8931 - regression_loss: 0.7697 - classification_loss: 0.1234 169/500 [=========>....................] - ETA: 1:49 - loss: 0.8927 - regression_loss: 0.7694 - classification_loss: 0.1234 170/500 [=========>....................] - ETA: 1:49 - loss: 0.8929 - regression_loss: 0.7697 - classification_loss: 0.1232 171/500 [=========>....................] - ETA: 1:48 - loss: 0.8944 - regression_loss: 0.7711 - classification_loss: 0.1233 172/500 [=========>....................] - ETA: 1:48 - loss: 0.8953 - regression_loss: 0.7715 - classification_loss: 0.1238 173/500 [=========>....................] - ETA: 1:48 - loss: 0.8970 - regression_loss: 0.7726 - classification_loss: 0.1244 174/500 [=========>....................] - ETA: 1:47 - loss: 0.8957 - regression_loss: 0.7714 - classification_loss: 0.1242 175/500 [=========>....................] - ETA: 1:47 - loss: 0.8955 - regression_loss: 0.7713 - classification_loss: 0.1242 176/500 [=========>....................] - ETA: 1:47 - loss: 0.8941 - regression_loss: 0.7701 - classification_loss: 0.1240 177/500 [=========>....................] - ETA: 1:46 - loss: 0.8939 - regression_loss: 0.7700 - classification_loss: 0.1239 178/500 [=========>....................] - ETA: 1:46 - loss: 0.8922 - regression_loss: 0.7685 - classification_loss: 0.1238 179/500 [=========>....................] - ETA: 1:46 - loss: 0.8905 - regression_loss: 0.7670 - classification_loss: 0.1235 180/500 [=========>....................] 
- ETA: 1:45 - loss: 0.8893 - regression_loss: 0.7660 - classification_loss: 0.1233 181/500 [=========>....................] - ETA: 1:45 - loss: 0.8861 - regression_loss: 0.7633 - classification_loss: 0.1228 182/500 [=========>....................] - ETA: 1:45 - loss: 0.8902 - regression_loss: 0.7667 - classification_loss: 0.1235 183/500 [=========>....................] - ETA: 1:44 - loss: 0.8885 - regression_loss: 0.7654 - classification_loss: 0.1231 184/500 [==========>...................] - ETA: 1:44 - loss: 0.8861 - regression_loss: 0.7635 - classification_loss: 0.1226 185/500 [==========>...................] - ETA: 1:44 - loss: 0.8866 - regression_loss: 0.7639 - classification_loss: 0.1227 186/500 [==========>...................] - ETA: 1:43 - loss: 0.8883 - regression_loss: 0.7651 - classification_loss: 0.1231 187/500 [==========>...................] - ETA: 1:43 - loss: 0.8886 - regression_loss: 0.7654 - classification_loss: 0.1232 188/500 [==========>...................] - ETA: 1:43 - loss: 0.8860 - regression_loss: 0.7631 - classification_loss: 0.1228 189/500 [==========>...................] - ETA: 1:42 - loss: 0.8832 - regression_loss: 0.7608 - classification_loss: 0.1224 190/500 [==========>...................] - ETA: 1:42 - loss: 0.8824 - regression_loss: 0.7600 - classification_loss: 0.1224 191/500 [==========>...................] - ETA: 1:42 - loss: 0.8819 - regression_loss: 0.7596 - classification_loss: 0.1222 192/500 [==========>...................] - ETA: 1:41 - loss: 0.8829 - regression_loss: 0.7603 - classification_loss: 0.1226 193/500 [==========>...................] - ETA: 1:41 - loss: 0.8819 - regression_loss: 0.7595 - classification_loss: 0.1225 194/500 [==========>...................] - ETA: 1:41 - loss: 0.8817 - regression_loss: 0.7592 - classification_loss: 0.1225 195/500 [==========>...................] - ETA: 1:40 - loss: 0.8794 - regression_loss: 0.7573 - classification_loss: 0.1221 196/500 [==========>...................] 
- ETA: 1:40 - loss: 0.8817 - regression_loss: 0.7589 - classification_loss: 0.1228 197/500 [==========>...................] - ETA: 1:40 - loss: 0.8802 - regression_loss: 0.7576 - classification_loss: 0.1226 198/500 [==========>...................] - ETA: 1:39 - loss: 0.8806 - regression_loss: 0.7580 - classification_loss: 0.1225 199/500 [==========>...................] - ETA: 1:39 - loss: 0.8787 - regression_loss: 0.7565 - classification_loss: 0.1222 200/500 [===========>..................] - ETA: 1:39 - loss: 0.8808 - regression_loss: 0.7582 - classification_loss: 0.1226 201/500 [===========>..................] - ETA: 1:38 - loss: 0.8803 - regression_loss: 0.7579 - classification_loss: 0.1223 202/500 [===========>..................] - ETA: 1:38 - loss: 0.8818 - regression_loss: 0.7592 - classification_loss: 0.1226 203/500 [===========>..................] - ETA: 1:38 - loss: 0.8834 - regression_loss: 0.7606 - classification_loss: 0.1228 204/500 [===========>..................] - ETA: 1:38 - loss: 0.8819 - regression_loss: 0.7594 - classification_loss: 0.1225 205/500 [===========>..................] - ETA: 1:37 - loss: 0.8825 - regression_loss: 0.7598 - classification_loss: 0.1227 206/500 [===========>..................] - ETA: 1:37 - loss: 0.8802 - regression_loss: 0.7578 - classification_loss: 0.1224 207/500 [===========>..................] - ETA: 1:37 - loss: 0.8778 - regression_loss: 0.7559 - classification_loss: 0.1220 208/500 [===========>..................] - ETA: 1:36 - loss: 0.8764 - regression_loss: 0.7547 - classification_loss: 0.1217 209/500 [===========>..................] - ETA: 1:36 - loss: 0.8760 - regression_loss: 0.7542 - classification_loss: 0.1217 210/500 [===========>..................] - ETA: 1:36 - loss: 0.8777 - regression_loss: 0.7556 - classification_loss: 0.1221 211/500 [===========>..................] - ETA: 1:35 - loss: 0.8771 - regression_loss: 0.7551 - classification_loss: 0.1220 212/500 [===========>..................] 
- ETA: 1:35 - loss: 0.8753 - regression_loss: 0.7535 - classification_loss: 0.1218 213/500 [===========>..................] - ETA: 1:35 - loss: 0.8757 - regression_loss: 0.7540 - classification_loss: 0.1217 214/500 [===========>..................] - ETA: 1:34 - loss: 0.8741 - regression_loss: 0.7527 - classification_loss: 0.1214 215/500 [===========>..................] - ETA: 1:34 - loss: 0.8717 - regression_loss: 0.7506 - classification_loss: 0.1211 216/500 [===========>..................] - ETA: 1:34 - loss: 0.8715 - regression_loss: 0.7505 - classification_loss: 0.1209 217/500 [============>.................] - ETA: 1:33 - loss: 0.8703 - regression_loss: 0.7496 - classification_loss: 0.1207 218/500 [============>.................] - ETA: 1:33 - loss: 0.8706 - regression_loss: 0.7498 - classification_loss: 0.1208 219/500 [============>.................] - ETA: 1:33 - loss: 0.8734 - regression_loss: 0.7523 - classification_loss: 0.1211 220/500 [============>.................] - ETA: 1:32 - loss: 0.8726 - regression_loss: 0.7515 - classification_loss: 0.1211 221/500 [============>.................] - ETA: 1:32 - loss: 0.8737 - regression_loss: 0.7524 - classification_loss: 0.1214 222/500 [============>.................] - ETA: 1:32 - loss: 0.8734 - regression_loss: 0.7521 - classification_loss: 0.1213 223/500 [============>.................] - ETA: 1:31 - loss: 0.8723 - regression_loss: 0.7512 - classification_loss: 0.1211 224/500 [============>.................] - ETA: 1:31 - loss: 0.8723 - regression_loss: 0.7513 - classification_loss: 0.1210 225/500 [============>.................] - ETA: 1:31 - loss: 0.8719 - regression_loss: 0.7512 - classification_loss: 0.1207 226/500 [============>.................] - ETA: 1:30 - loss: 0.8706 - regression_loss: 0.7502 - classification_loss: 0.1204 227/500 [============>.................] - ETA: 1:30 - loss: 0.8694 - regression_loss: 0.7491 - classification_loss: 0.1202 228/500 [============>.................] 
- ETA: 1:30 - loss: 0.8709 - regression_loss: 0.7506 - classification_loss: 0.1203 229/500 [============>.................] - ETA: 1:29 - loss: 0.8715 - regression_loss: 0.7514 - classification_loss: 0.1201 230/500 [============>.................] - ETA: 1:29 - loss: 0.8689 - regression_loss: 0.7491 - classification_loss: 0.1198 231/500 [============>.................] - ETA: 1:29 - loss: 0.8668 - regression_loss: 0.7473 - classification_loss: 0.1194 232/500 [============>.................] - ETA: 1:28 - loss: 0.8661 - regression_loss: 0.7467 - classification_loss: 0.1194 233/500 [============>.................] - ETA: 1:28 - loss: 0.8639 - regression_loss: 0.7449 - classification_loss: 0.1190 234/500 [=============>................] - ETA: 1:28 - loss: 0.8636 - regression_loss: 0.7447 - classification_loss: 0.1189 235/500 [=============>................] - ETA: 1:27 - loss: 0.8657 - regression_loss: 0.7464 - classification_loss: 0.1192 236/500 [=============>................] - ETA: 1:27 - loss: 0.8651 - regression_loss: 0.7461 - classification_loss: 0.1190 237/500 [=============>................] - ETA: 1:27 - loss: 0.8663 - regression_loss: 0.7470 - classification_loss: 0.1192 238/500 [=============>................] - ETA: 1:26 - loss: 0.8678 - regression_loss: 0.7483 - classification_loss: 0.1195 239/500 [=============>................] - ETA: 1:26 - loss: 0.8683 - regression_loss: 0.7487 - classification_loss: 0.1195 240/500 [=============>................] - ETA: 1:26 - loss: 0.8674 - regression_loss: 0.7481 - classification_loss: 0.1193 241/500 [=============>................] - ETA: 1:25 - loss: 0.8679 - regression_loss: 0.7484 - classification_loss: 0.1195 242/500 [=============>................] - ETA: 1:25 - loss: 0.8682 - regression_loss: 0.7489 - classification_loss: 0.1194 243/500 [=============>................] - ETA: 1:25 - loss: 0.8687 - regression_loss: 0.7492 - classification_loss: 0.1195 244/500 [=============>................] 
- ETA: 1:24 - loss: 0.8668 - regression_loss: 0.7476 - classification_loss: 0.1192 245/500 [=============>................] - ETA: 1:24 - loss: 0.8645 - regression_loss: 0.7456 - classification_loss: 0.1189 246/500 [=============>................] - ETA: 1:24 - loss: 0.8652 - regression_loss: 0.7462 - classification_loss: 0.1190 247/500 [=============>................] - ETA: 1:23 - loss: 0.8671 - regression_loss: 0.7478 - classification_loss: 0.1194 248/500 [=============>................] - ETA: 1:23 - loss: 0.8668 - regression_loss: 0.7476 - classification_loss: 0.1192 249/500 [=============>................] - ETA: 1:23 - loss: 0.8674 - regression_loss: 0.7482 - classification_loss: 0.1192 250/500 [==============>...............] - ETA: 1:22 - loss: 0.8698 - regression_loss: 0.7502 - classification_loss: 0.1196 251/500 [==============>...............] - ETA: 1:22 - loss: 0.8694 - regression_loss: 0.7498 - classification_loss: 0.1196 252/500 [==============>...............] - ETA: 1:22 - loss: 0.8671 - regression_loss: 0.7479 - classification_loss: 0.1193 253/500 [==============>...............] - ETA: 1:21 - loss: 0.8673 - regression_loss: 0.7481 - classification_loss: 0.1193 254/500 [==============>...............] - ETA: 1:21 - loss: 0.8680 - regression_loss: 0.7487 - classification_loss: 0.1193 255/500 [==============>...............] - ETA: 1:21 - loss: 0.8682 - regression_loss: 0.7488 - classification_loss: 0.1194 256/500 [==============>...............] - ETA: 1:20 - loss: 0.8660 - regression_loss: 0.7470 - classification_loss: 0.1190 257/500 [==============>...............] - ETA: 1:20 - loss: 0.8647 - regression_loss: 0.7459 - classification_loss: 0.1188 258/500 [==============>...............] - ETA: 1:20 - loss: 0.8659 - regression_loss: 0.7469 - classification_loss: 0.1190 259/500 [==============>...............] - ETA: 1:19 - loss: 0.8648 - regression_loss: 0.7460 - classification_loss: 0.1188 260/500 [==============>...............] 
- ETA: 1:19 - loss: 0.8658 - regression_loss: 0.7468 - classification_loss: 0.1190 261/500 [==============>...............] - ETA: 1:19 - loss: 0.8639 - regression_loss: 0.7453 - classification_loss: 0.1187 262/500 [==============>...............] - ETA: 1:18 - loss: 0.8652 - regression_loss: 0.7463 - classification_loss: 0.1189 263/500 [==============>...............] - ETA: 1:18 - loss: 0.8650 - regression_loss: 0.7461 - classification_loss: 0.1189 264/500 [==============>...............] - ETA: 1:18 - loss: 0.8649 - regression_loss: 0.7462 - classification_loss: 0.1188 265/500 [==============>...............] - ETA: 1:17 - loss: 0.8636 - regression_loss: 0.7452 - classification_loss: 0.1185 266/500 [==============>...............] - ETA: 1:17 - loss: 0.8619 - regression_loss: 0.7437 - classification_loss: 0.1183 267/500 [===============>..............] - ETA: 1:17 - loss: 0.8633 - regression_loss: 0.7447 - classification_loss: 0.1186 268/500 [===============>..............] - ETA: 1:16 - loss: 0.8636 - regression_loss: 0.7451 - classification_loss: 0.1185 269/500 [===============>..............] - ETA: 1:16 - loss: 0.8632 - regression_loss: 0.7447 - classification_loss: 0.1185 270/500 [===============>..............] - ETA: 1:16 - loss: 0.8631 - regression_loss: 0.7446 - classification_loss: 0.1185 271/500 [===============>..............] - ETA: 1:16 - loss: 0.8615 - regression_loss: 0.7432 - classification_loss: 0.1183 272/500 [===============>..............] - ETA: 1:15 - loss: 0.8619 - regression_loss: 0.7435 - classification_loss: 0.1184 273/500 [===============>..............] - ETA: 1:15 - loss: 0.8632 - regression_loss: 0.7446 - classification_loss: 0.1186 274/500 [===============>..............] - ETA: 1:15 - loss: 0.8631 - regression_loss: 0.7447 - classification_loss: 0.1184 275/500 [===============>..............] - ETA: 1:14 - loss: 0.8636 - regression_loss: 0.7451 - classification_loss: 0.1185 276/500 [===============>..............] 
- ETA: 1:14 - loss: 0.8619 - regression_loss: 0.7436 - classification_loss: 0.1183 277/500 [===============>..............] - ETA: 1:14 - loss: 0.8614 - regression_loss: 0.7434 - classification_loss: 0.1181 278/500 [===============>..............] - ETA: 1:13 - loss: 0.8624 - regression_loss: 0.7443 - classification_loss: 0.1181 279/500 [===============>..............] - ETA: 1:13 - loss: 0.8614 - regression_loss: 0.7434 - classification_loss: 0.1180 280/500 [===============>..............] - ETA: 1:12 - loss: 0.8598 - regression_loss: 0.7421 - classification_loss: 0.1177 281/500 [===============>..............] - ETA: 1:12 - loss: 0.8604 - regression_loss: 0.7426 - classification_loss: 0.1178 282/500 [===============>..............] - ETA: 1:12 - loss: 0.8604 - regression_loss: 0.7427 - classification_loss: 0.1176 283/500 [===============>..............] - ETA: 1:12 - loss: 0.8610 - regression_loss: 0.7433 - classification_loss: 0.1177 284/500 [================>.............] - ETA: 1:11 - loss: 0.8598 - regression_loss: 0.7423 - classification_loss: 0.1175 285/500 [================>.............] - ETA: 1:11 - loss: 0.8593 - regression_loss: 0.7418 - classification_loss: 0.1175 286/500 [================>.............] - ETA: 1:11 - loss: 0.8594 - regression_loss: 0.7419 - classification_loss: 0.1175 287/500 [================>.............] - ETA: 1:10 - loss: 0.8588 - regression_loss: 0.7415 - classification_loss: 0.1173 288/500 [================>.............] - ETA: 1:10 - loss: 0.8575 - regression_loss: 0.7402 - classification_loss: 0.1172 289/500 [================>.............] - ETA: 1:10 - loss: 0.8596 - regression_loss: 0.7420 - classification_loss: 0.1176 290/500 [================>.............] - ETA: 1:09 - loss: 0.8576 - regression_loss: 0.7403 - classification_loss: 0.1174 291/500 [================>.............] - ETA: 1:09 - loss: 0.8586 - regression_loss: 0.7412 - classification_loss: 0.1174 292/500 [================>.............] 
- ETA: 1:09 - loss: 0.8563 - regression_loss: 0.7392 - classification_loss: 0.1172 293/500 [================>.............] - ETA: 1:08 - loss: 0.8564 - regression_loss: 0.7392 - classification_loss: 0.1172 294/500 [================>.............] - ETA: 1:08 - loss: 0.8559 - regression_loss: 0.7388 - classification_loss: 0.1171 295/500 [================>.............] - ETA: 1:08 - loss: 0.8552 - regression_loss: 0.7381 - classification_loss: 0.1171 296/500 [================>.............] - ETA: 1:07 - loss: 0.8544 - regression_loss: 0.7376 - classification_loss: 0.1168 297/500 [================>.............] - ETA: 1:07 - loss: 0.8562 - regression_loss: 0.7392 - classification_loss: 0.1171 298/500 [================>.............] - ETA: 1:07 - loss: 0.8572 - regression_loss: 0.7401 - classification_loss: 0.1171 299/500 [================>.............] - ETA: 1:06 - loss: 0.8593 - regression_loss: 0.7417 - classification_loss: 0.1175 300/500 [=================>............] - ETA: 1:06 - loss: 0.8591 - regression_loss: 0.7416 - classification_loss: 0.1176 301/500 [=================>............] - ETA: 1:06 - loss: 0.8599 - regression_loss: 0.7422 - classification_loss: 0.1177 302/500 [=================>............] - ETA: 1:05 - loss: 0.8600 - regression_loss: 0.7422 - classification_loss: 0.1177 303/500 [=================>............] - ETA: 1:05 - loss: 0.8593 - regression_loss: 0.7416 - classification_loss: 0.1177 304/500 [=================>............] - ETA: 1:05 - loss: 0.8610 - regression_loss: 0.7430 - classification_loss: 0.1180 305/500 [=================>............] - ETA: 1:04 - loss: 0.8602 - regression_loss: 0.7424 - classification_loss: 0.1177 306/500 [=================>............] - ETA: 1:04 - loss: 0.8589 - regression_loss: 0.7414 - classification_loss: 0.1175 307/500 [=================>............] - ETA: 1:04 - loss: 0.8588 - regression_loss: 0.7414 - classification_loss: 0.1174 308/500 [=================>............] 
- ETA: 1:03 - loss: 0.8604 - regression_loss: 0.7428 - classification_loss: 0.1176 309/500 [=================>............] - ETA: 1:03 - loss: 0.8604 - regression_loss: 0.7428 - classification_loss: 0.1176 310/500 [=================>............] - ETA: 1:03 - loss: 0.8615 - regression_loss: 0.7438 - classification_loss: 0.1177 311/500 [=================>............] - ETA: 1:02 - loss: 0.8620 - regression_loss: 0.7441 - classification_loss: 0.1179 312/500 [=================>............] - ETA: 1:02 - loss: 0.8610 - regression_loss: 0.7432 - classification_loss: 0.1178 313/500 [=================>............] - ETA: 1:02 - loss: 0.8595 - regression_loss: 0.7418 - classification_loss: 0.1177 314/500 [=================>............] - ETA: 1:01 - loss: 0.8625 - regression_loss: 0.7443 - classification_loss: 0.1182 315/500 [=================>............] - ETA: 1:01 - loss: 0.8626 - regression_loss: 0.7445 - classification_loss: 0.1181 316/500 [=================>............] - ETA: 1:01 - loss: 0.8618 - regression_loss: 0.7439 - classification_loss: 0.1179 317/500 [==================>...........] - ETA: 1:00 - loss: 0.8600 - regression_loss: 0.7422 - classification_loss: 0.1178 318/500 [==================>...........] - ETA: 1:00 - loss: 0.8609 - regression_loss: 0.7430 - classification_loss: 0.1179 319/500 [==================>...........] - ETA: 1:00 - loss: 0.8614 - regression_loss: 0.7433 - classification_loss: 0.1181 320/500 [==================>...........] - ETA: 59s - loss: 0.8619 - regression_loss: 0.7439 - classification_loss: 0.1180  321/500 [==================>...........] - ETA: 59s - loss: 0.8629 - regression_loss: 0.7448 - classification_loss: 0.1181 322/500 [==================>...........] - ETA: 59s - loss: 0.8641 - regression_loss: 0.7457 - classification_loss: 0.1183 323/500 [==================>...........] - ETA: 58s - loss: 0.8637 - regression_loss: 0.7454 - classification_loss: 0.1183 324/500 [==================>...........] 
- ETA: 58s - loss: 0.8642 - regression_loss: 0.7457 - classification_loss: 0.1184 325/500 [==================>...........] - ETA: 58s - loss: 0.8630 - regression_loss: 0.7447 - classification_loss: 0.1183 326/500 [==================>...........] - ETA: 57s - loss: 0.8643 - regression_loss: 0.7461 - classification_loss: 0.1183 327/500 [==================>...........] - ETA: 57s - loss: 0.8642 - regression_loss: 0.7461 - classification_loss: 0.1181 328/500 [==================>...........] - ETA: 57s - loss: 0.8634 - regression_loss: 0.7455 - classification_loss: 0.1179 329/500 [==================>...........] - ETA: 56s - loss: 0.8644 - regression_loss: 0.7467 - classification_loss: 0.1177 330/500 [==================>...........] - ETA: 56s - loss: 0.8640 - regression_loss: 0.7465 - classification_loss: 0.1175 331/500 [==================>...........] - ETA: 56s - loss: 0.8639 - regression_loss: 0.7465 - classification_loss: 0.1174 332/500 [==================>...........] - ETA: 55s - loss: 0.8619 - regression_loss: 0.7448 - classification_loss: 0.1171 333/500 [==================>...........] - ETA: 55s - loss: 0.8633 - regression_loss: 0.7461 - classification_loss: 0.1172 334/500 [===================>..........] - ETA: 55s - loss: 0.8644 - regression_loss: 0.7471 - classification_loss: 0.1173 335/500 [===================>..........] - ETA: 54s - loss: 0.8634 - regression_loss: 0.7464 - classification_loss: 0.1171 336/500 [===================>..........] - ETA: 54s - loss: 0.8642 - regression_loss: 0.7471 - classification_loss: 0.1171 337/500 [===================>..........] - ETA: 54s - loss: 0.8630 - regression_loss: 0.7461 - classification_loss: 0.1169 338/500 [===================>..........] - ETA: 53s - loss: 0.8655 - regression_loss: 0.7480 - classification_loss: 0.1175 339/500 [===================>..........] - ETA: 53s - loss: 0.8638 - regression_loss: 0.7465 - classification_loss: 0.1172 340/500 [===================>..........] 
[Epoch 36: per-batch progress lines for steps 341/500 through 499/500 trimmed; loss held near 0.86 (regression ~0.75, classification ~0.12)]
500/500 [==============================] - 166s 331ms/step - loss: 0.8645 - regression_loss: 0.7480 - classification_loss: 0.1165
1172 instances of class plum with average precision: 0.6270
mAP: 0.6270
Epoch 00036: saving model to ./training/snapshots/resnet101_pascal_36.h5
Epoch 37/150
[Epoch 37: per-batch progress lines for steps 1/500 through 174/500 trimmed; loss ranged roughly 0.68-0.92 (regression ~0.69-0.79, classification ~0.10-0.13)]
- ETA: 1:46 - loss: 0.8461 - regression_loss: 0.7345 - classification_loss: 0.1116 175/500 [=========>....................] - ETA: 1:46 - loss: 0.8470 - regression_loss: 0.7354 - classification_loss: 0.1117 176/500 [=========>....................] - ETA: 1:46 - loss: 0.8489 - regression_loss: 0.7370 - classification_loss: 0.1119 177/500 [=========>....................] - ETA: 1:45 - loss: 0.8451 - regression_loss: 0.7337 - classification_loss: 0.1115 178/500 [=========>....................] - ETA: 1:45 - loss: 0.8429 - regression_loss: 0.7317 - classification_loss: 0.1112 179/500 [=========>....................] - ETA: 1:45 - loss: 0.8438 - regression_loss: 0.7324 - classification_loss: 0.1114 180/500 [=========>....................] - ETA: 1:44 - loss: 0.8424 - regression_loss: 0.7312 - classification_loss: 0.1112 181/500 [=========>....................] - ETA: 1:44 - loss: 0.8432 - regression_loss: 0.7318 - classification_loss: 0.1114 182/500 [=========>....................] - ETA: 1:44 - loss: 0.8427 - regression_loss: 0.7316 - classification_loss: 0.1111 183/500 [=========>....................] - ETA: 1:43 - loss: 0.8422 - regression_loss: 0.7314 - classification_loss: 0.1109 184/500 [==========>...................] - ETA: 1:43 - loss: 0.8429 - regression_loss: 0.7321 - classification_loss: 0.1108 185/500 [==========>...................] - ETA: 1:43 - loss: 0.8404 - regression_loss: 0.7301 - classification_loss: 0.1104 186/500 [==========>...................] - ETA: 1:42 - loss: 0.8395 - regression_loss: 0.7292 - classification_loss: 0.1103 187/500 [==========>...................] - ETA: 1:42 - loss: 0.8413 - regression_loss: 0.7307 - classification_loss: 0.1106 188/500 [==========>...................] - ETA: 1:42 - loss: 0.8430 - regression_loss: 0.7321 - classification_loss: 0.1109 189/500 [==========>...................] - ETA: 1:41 - loss: 0.8443 - regression_loss: 0.7333 - classification_loss: 0.1110 190/500 [==========>...................] 
- ETA: 1:41 - loss: 0.8427 - regression_loss: 0.7319 - classification_loss: 0.1108 191/500 [==========>...................] - ETA: 1:41 - loss: 0.8451 - regression_loss: 0.7337 - classification_loss: 0.1114 192/500 [==========>...................] - ETA: 1:40 - loss: 0.8477 - regression_loss: 0.7358 - classification_loss: 0.1119 193/500 [==========>...................] - ETA: 1:40 - loss: 0.8480 - regression_loss: 0.7361 - classification_loss: 0.1119 194/500 [==========>...................] - ETA: 1:40 - loss: 0.8484 - regression_loss: 0.7364 - classification_loss: 0.1120 195/500 [==========>...................] - ETA: 1:40 - loss: 0.8519 - regression_loss: 0.7392 - classification_loss: 0.1127 196/500 [==========>...................] - ETA: 1:39 - loss: 0.8525 - regression_loss: 0.7396 - classification_loss: 0.1129 197/500 [==========>...................] - ETA: 1:39 - loss: 0.8507 - regression_loss: 0.7381 - classification_loss: 0.1126 198/500 [==========>...................] - ETA: 1:38 - loss: 0.8520 - regression_loss: 0.7391 - classification_loss: 0.1129 199/500 [==========>...................] - ETA: 1:38 - loss: 0.8515 - regression_loss: 0.7387 - classification_loss: 0.1128 200/500 [===========>..................] - ETA: 1:38 - loss: 0.8497 - regression_loss: 0.7371 - classification_loss: 0.1125 201/500 [===========>..................] - ETA: 1:37 - loss: 0.8509 - regression_loss: 0.7380 - classification_loss: 0.1128 202/500 [===========>..................] - ETA: 1:37 - loss: 0.8519 - regression_loss: 0.7390 - classification_loss: 0.1129 203/500 [===========>..................] - ETA: 1:37 - loss: 0.8557 - regression_loss: 0.7422 - classification_loss: 0.1135 204/500 [===========>..................] - ETA: 1:36 - loss: 0.8544 - regression_loss: 0.7412 - classification_loss: 0.1133 205/500 [===========>..................] - ETA: 1:36 - loss: 0.8541 - regression_loss: 0.7409 - classification_loss: 0.1133 206/500 [===========>..................] 
- ETA: 1:36 - loss: 0.8540 - regression_loss: 0.7404 - classification_loss: 0.1136 207/500 [===========>..................] - ETA: 1:36 - loss: 0.8517 - regression_loss: 0.7385 - classification_loss: 0.1133 208/500 [===========>..................] - ETA: 1:35 - loss: 0.8504 - regression_loss: 0.7374 - classification_loss: 0.1131 209/500 [===========>..................] - ETA: 1:35 - loss: 0.8473 - regression_loss: 0.7346 - classification_loss: 0.1126 210/500 [===========>..................] - ETA: 1:35 - loss: 0.8447 - regression_loss: 0.7325 - classification_loss: 0.1122 211/500 [===========>..................] - ETA: 1:34 - loss: 0.8433 - regression_loss: 0.7314 - classification_loss: 0.1120 212/500 [===========>..................] - ETA: 1:34 - loss: 0.8444 - regression_loss: 0.7321 - classification_loss: 0.1123 213/500 [===========>..................] - ETA: 1:34 - loss: 0.8433 - regression_loss: 0.7313 - classification_loss: 0.1121 214/500 [===========>..................] - ETA: 1:33 - loss: 0.8444 - regression_loss: 0.7322 - classification_loss: 0.1122 215/500 [===========>..................] - ETA: 1:33 - loss: 0.8429 - regression_loss: 0.7309 - classification_loss: 0.1119 216/500 [===========>..................] - ETA: 1:33 - loss: 0.8412 - regression_loss: 0.7294 - classification_loss: 0.1117 217/500 [============>.................] - ETA: 1:32 - loss: 0.8411 - regression_loss: 0.7293 - classification_loss: 0.1117 218/500 [============>.................] - ETA: 1:32 - loss: 0.8427 - regression_loss: 0.7307 - classification_loss: 0.1120 219/500 [============>.................] - ETA: 1:32 - loss: 0.8427 - regression_loss: 0.7307 - classification_loss: 0.1120 220/500 [============>.................] - ETA: 1:31 - loss: 0.8426 - regression_loss: 0.7307 - classification_loss: 0.1119 221/500 [============>.................] - ETA: 1:31 - loss: 0.8428 - regression_loss: 0.7308 - classification_loss: 0.1119 222/500 [============>.................] 
- ETA: 1:31 - loss: 0.8412 - regression_loss: 0.7294 - classification_loss: 0.1118 223/500 [============>.................] - ETA: 1:30 - loss: 0.8406 - regression_loss: 0.7290 - classification_loss: 0.1117 224/500 [============>.................] - ETA: 1:30 - loss: 0.8446 - regression_loss: 0.7322 - classification_loss: 0.1125 225/500 [============>.................] - ETA: 1:30 - loss: 0.8440 - regression_loss: 0.7317 - classification_loss: 0.1122 226/500 [============>.................] - ETA: 1:29 - loss: 0.8416 - regression_loss: 0.7297 - classification_loss: 0.1119 227/500 [============>.................] - ETA: 1:29 - loss: 0.8399 - regression_loss: 0.7283 - classification_loss: 0.1116 228/500 [============>.................] - ETA: 1:29 - loss: 0.8383 - regression_loss: 0.7268 - classification_loss: 0.1115 229/500 [============>.................] - ETA: 1:29 - loss: 0.8360 - regression_loss: 0.7248 - classification_loss: 0.1112 230/500 [============>.................] - ETA: 1:28 - loss: 0.8347 - regression_loss: 0.7237 - classification_loss: 0.1110 231/500 [============>.................] - ETA: 1:28 - loss: 0.8328 - regression_loss: 0.7219 - classification_loss: 0.1109 232/500 [============>.................] - ETA: 1:27 - loss: 0.8333 - regression_loss: 0.7225 - classification_loss: 0.1108 233/500 [============>.................] - ETA: 1:27 - loss: 0.8351 - regression_loss: 0.7241 - classification_loss: 0.1110 234/500 [=============>................] - ETA: 1:27 - loss: 0.8349 - regression_loss: 0.7240 - classification_loss: 0.1109 235/500 [=============>................] - ETA: 1:27 - loss: 0.8371 - regression_loss: 0.7260 - classification_loss: 0.1112 236/500 [=============>................] - ETA: 1:26 - loss: 0.8344 - regression_loss: 0.7235 - classification_loss: 0.1109 237/500 [=============>................] - ETA: 1:26 - loss: 0.8325 - regression_loss: 0.7218 - classification_loss: 0.1106 238/500 [=============>................] 
- ETA: 1:26 - loss: 0.8306 - regression_loss: 0.7203 - classification_loss: 0.1103 239/500 [=============>................] - ETA: 1:25 - loss: 0.8314 - regression_loss: 0.7211 - classification_loss: 0.1104 240/500 [=============>................] - ETA: 1:25 - loss: 0.8320 - regression_loss: 0.7215 - classification_loss: 0.1105 241/500 [=============>................] - ETA: 1:25 - loss: 0.8320 - regression_loss: 0.7215 - classification_loss: 0.1105 242/500 [=============>................] - ETA: 1:24 - loss: 0.8313 - regression_loss: 0.7210 - classification_loss: 0.1104 243/500 [=============>................] - ETA: 1:24 - loss: 0.8306 - regression_loss: 0.7204 - classification_loss: 0.1102 244/500 [=============>................] - ETA: 1:24 - loss: 0.8302 - regression_loss: 0.7200 - classification_loss: 0.1101 245/500 [=============>................] - ETA: 1:23 - loss: 0.8298 - regression_loss: 0.7196 - classification_loss: 0.1102 246/500 [=============>................] - ETA: 1:23 - loss: 0.8296 - regression_loss: 0.7195 - classification_loss: 0.1101 247/500 [=============>................] - ETA: 1:23 - loss: 0.8295 - regression_loss: 0.7193 - classification_loss: 0.1102 248/500 [=============>................] - ETA: 1:22 - loss: 0.8286 - regression_loss: 0.7186 - classification_loss: 0.1100 249/500 [=============>................] - ETA: 1:22 - loss: 0.8297 - regression_loss: 0.7194 - classification_loss: 0.1102 250/500 [==============>...............] - ETA: 1:22 - loss: 0.8308 - regression_loss: 0.7203 - classification_loss: 0.1105 251/500 [==============>...............] - ETA: 1:21 - loss: 0.8315 - regression_loss: 0.7207 - classification_loss: 0.1108 252/500 [==============>...............] - ETA: 1:21 - loss: 0.8309 - regression_loss: 0.7203 - classification_loss: 0.1106 253/500 [==============>...............] - ETA: 1:21 - loss: 0.8297 - regression_loss: 0.7194 - classification_loss: 0.1104 254/500 [==============>...............] 
- ETA: 1:20 - loss: 0.8280 - regression_loss: 0.7179 - classification_loss: 0.1101 255/500 [==============>...............] - ETA: 1:20 - loss: 0.8297 - regression_loss: 0.7194 - classification_loss: 0.1103 256/500 [==============>...............] - ETA: 1:20 - loss: 0.8298 - regression_loss: 0.7196 - classification_loss: 0.1102 257/500 [==============>...............] - ETA: 1:19 - loss: 0.8283 - regression_loss: 0.7183 - classification_loss: 0.1100 258/500 [==============>...............] - ETA: 1:19 - loss: 0.8286 - regression_loss: 0.7185 - classification_loss: 0.1101 259/500 [==============>...............] - ETA: 1:19 - loss: 0.8283 - regression_loss: 0.7183 - classification_loss: 0.1100 260/500 [==============>...............] - ETA: 1:18 - loss: 0.8296 - regression_loss: 0.7195 - classification_loss: 0.1101 261/500 [==============>...............] - ETA: 1:18 - loss: 0.8289 - regression_loss: 0.7190 - classification_loss: 0.1099 262/500 [==============>...............] - ETA: 1:18 - loss: 0.8299 - regression_loss: 0.7200 - classification_loss: 0.1099 263/500 [==============>...............] - ETA: 1:17 - loss: 0.8288 - regression_loss: 0.7190 - classification_loss: 0.1098 264/500 [==============>...............] - ETA: 1:17 - loss: 0.8287 - regression_loss: 0.7190 - classification_loss: 0.1097 265/500 [==============>...............] - ETA: 1:17 - loss: 0.8291 - regression_loss: 0.7192 - classification_loss: 0.1098 266/500 [==============>...............] - ETA: 1:16 - loss: 0.8295 - regression_loss: 0.7196 - classification_loss: 0.1099 267/500 [===============>..............] - ETA: 1:16 - loss: 0.8289 - regression_loss: 0.7191 - classification_loss: 0.1098 268/500 [===============>..............] - ETA: 1:16 - loss: 0.8278 - regression_loss: 0.7181 - classification_loss: 0.1097 269/500 [===============>..............] - ETA: 1:15 - loss: 0.8258 - regression_loss: 0.7164 - classification_loss: 0.1094 270/500 [===============>..............] 
- ETA: 1:15 - loss: 0.8260 - regression_loss: 0.7166 - classification_loss: 0.1094 271/500 [===============>..............] - ETA: 1:15 - loss: 0.8259 - regression_loss: 0.7165 - classification_loss: 0.1093 272/500 [===============>..............] - ETA: 1:14 - loss: 0.8250 - regression_loss: 0.7158 - classification_loss: 0.1091 273/500 [===============>..............] - ETA: 1:14 - loss: 0.8250 - regression_loss: 0.7159 - classification_loss: 0.1091 274/500 [===============>..............] - ETA: 1:14 - loss: 0.8241 - regression_loss: 0.7151 - classification_loss: 0.1090 275/500 [===============>..............] - ETA: 1:13 - loss: 0.8252 - regression_loss: 0.7161 - classification_loss: 0.1092 276/500 [===============>..............] - ETA: 1:13 - loss: 0.8236 - regression_loss: 0.7147 - classification_loss: 0.1089 277/500 [===============>..............] - ETA: 1:13 - loss: 0.8223 - regression_loss: 0.7135 - classification_loss: 0.1088 278/500 [===============>..............] - ETA: 1:12 - loss: 0.8226 - regression_loss: 0.7138 - classification_loss: 0.1089 279/500 [===============>..............] - ETA: 1:12 - loss: 0.8220 - regression_loss: 0.7133 - classification_loss: 0.1087 280/500 [===============>..............] - ETA: 1:12 - loss: 0.8224 - regression_loss: 0.7136 - classification_loss: 0.1088 281/500 [===============>..............] - ETA: 1:11 - loss: 0.8215 - regression_loss: 0.7128 - classification_loss: 0.1087 282/500 [===============>..............] - ETA: 1:11 - loss: 0.8224 - regression_loss: 0.7136 - classification_loss: 0.1088 283/500 [===============>..............] - ETA: 1:11 - loss: 0.8216 - regression_loss: 0.7130 - classification_loss: 0.1086 284/500 [================>.............] - ETA: 1:10 - loss: 0.8201 - regression_loss: 0.7116 - classification_loss: 0.1085 285/500 [================>.............] - ETA: 1:10 - loss: 0.8211 - regression_loss: 0.7123 - classification_loss: 0.1088 286/500 [================>.............] 
- ETA: 1:10 - loss: 0.8220 - regression_loss: 0.7131 - classification_loss: 0.1088 287/500 [================>.............] - ETA: 1:10 - loss: 0.8213 - regression_loss: 0.7126 - classification_loss: 0.1088 288/500 [================>.............] - ETA: 1:09 - loss: 0.8217 - regression_loss: 0.7129 - classification_loss: 0.1088 289/500 [================>.............] - ETA: 1:09 - loss: 0.8232 - regression_loss: 0.7139 - classification_loss: 0.1092 290/500 [================>.............] - ETA: 1:09 - loss: 0.8214 - regression_loss: 0.7124 - classification_loss: 0.1090 291/500 [================>.............] - ETA: 1:08 - loss: 0.8238 - regression_loss: 0.7144 - classification_loss: 0.1094 292/500 [================>.............] - ETA: 1:08 - loss: 0.8245 - regression_loss: 0.7150 - classification_loss: 0.1096 293/500 [================>.............] - ETA: 1:08 - loss: 0.8263 - regression_loss: 0.7163 - classification_loss: 0.1100 294/500 [================>.............] - ETA: 1:07 - loss: 0.8252 - regression_loss: 0.7153 - classification_loss: 0.1099 295/500 [================>.............] - ETA: 1:07 - loss: 0.8232 - regression_loss: 0.7136 - classification_loss: 0.1096 296/500 [================>.............] - ETA: 1:07 - loss: 0.8231 - regression_loss: 0.7134 - classification_loss: 0.1097 297/500 [================>.............] - ETA: 1:06 - loss: 0.8242 - regression_loss: 0.7144 - classification_loss: 0.1098 298/500 [================>.............] - ETA: 1:06 - loss: 0.8239 - regression_loss: 0.7142 - classification_loss: 0.1098 299/500 [================>.............] - ETA: 1:06 - loss: 0.8261 - regression_loss: 0.7159 - classification_loss: 0.1101 300/500 [=================>............] - ETA: 1:05 - loss: 0.8258 - regression_loss: 0.7156 - classification_loss: 0.1102 301/500 [=================>............] - ETA: 1:05 - loss: 0.8250 - regression_loss: 0.7149 - classification_loss: 0.1101 302/500 [=================>............] 
- ETA: 1:05 - loss: 0.8240 - regression_loss: 0.7139 - classification_loss: 0.1101 303/500 [=================>............] - ETA: 1:04 - loss: 0.8235 - regression_loss: 0.7133 - classification_loss: 0.1101 304/500 [=================>............] - ETA: 1:04 - loss: 0.8232 - regression_loss: 0.7131 - classification_loss: 0.1101 305/500 [=================>............] - ETA: 1:04 - loss: 0.8233 - regression_loss: 0.7133 - classification_loss: 0.1100 306/500 [=================>............] - ETA: 1:03 - loss: 0.8231 - regression_loss: 0.7131 - classification_loss: 0.1100 307/500 [=================>............] - ETA: 1:03 - loss: 0.8239 - regression_loss: 0.7137 - classification_loss: 0.1102 308/500 [=================>............] - ETA: 1:03 - loss: 0.8248 - regression_loss: 0.7146 - classification_loss: 0.1102 309/500 [=================>............] - ETA: 1:02 - loss: 0.8246 - regression_loss: 0.7144 - classification_loss: 0.1102 310/500 [=================>............] - ETA: 1:02 - loss: 0.8243 - regression_loss: 0.7141 - classification_loss: 0.1101 311/500 [=================>............] - ETA: 1:02 - loss: 0.8233 - regression_loss: 0.7133 - classification_loss: 0.1100 312/500 [=================>............] - ETA: 1:01 - loss: 0.8221 - regression_loss: 0.7124 - classification_loss: 0.1098 313/500 [=================>............] - ETA: 1:01 - loss: 0.8231 - regression_loss: 0.7132 - classification_loss: 0.1099 314/500 [=================>............] - ETA: 1:01 - loss: 0.8222 - regression_loss: 0.7125 - classification_loss: 0.1097 315/500 [=================>............] - ETA: 1:00 - loss: 0.8213 - regression_loss: 0.7117 - classification_loss: 0.1096 316/500 [=================>............] - ETA: 1:00 - loss: 0.8224 - regression_loss: 0.7125 - classification_loss: 0.1098 317/500 [==================>...........] - ETA: 1:00 - loss: 0.8214 - regression_loss: 0.7118 - classification_loss: 0.1096 318/500 [==================>...........] 
- ETA: 59s - loss: 0.8218 - regression_loss: 0.7121 - classification_loss: 0.1097  319/500 [==================>...........] - ETA: 59s - loss: 0.8221 - regression_loss: 0.7125 - classification_loss: 0.1096 320/500 [==================>...........] - ETA: 59s - loss: 0.8208 - regression_loss: 0.7114 - classification_loss: 0.1094 321/500 [==================>...........] - ETA: 58s - loss: 0.8222 - regression_loss: 0.7125 - classification_loss: 0.1096 322/500 [==================>...........] - ETA: 58s - loss: 0.8215 - regression_loss: 0.7120 - classification_loss: 0.1095 323/500 [==================>...........] - ETA: 58s - loss: 0.8216 - regression_loss: 0.7121 - classification_loss: 0.1095 324/500 [==================>...........] - ETA: 57s - loss: 0.8220 - regression_loss: 0.7124 - classification_loss: 0.1096 325/500 [==================>...........] - ETA: 57s - loss: 0.8237 - regression_loss: 0.7138 - classification_loss: 0.1098 326/500 [==================>...........] - ETA: 57s - loss: 0.8225 - regression_loss: 0.7129 - classification_loss: 0.1096 327/500 [==================>...........] - ETA: 56s - loss: 0.8221 - regression_loss: 0.7126 - classification_loss: 0.1096 328/500 [==================>...........] - ETA: 56s - loss: 0.8216 - regression_loss: 0.7121 - classification_loss: 0.1095 329/500 [==================>...........] - ETA: 56s - loss: 0.8214 - regression_loss: 0.7119 - classification_loss: 0.1095 330/500 [==================>...........] - ETA: 55s - loss: 0.8219 - regression_loss: 0.7124 - classification_loss: 0.1096 331/500 [==================>...........] - ETA: 55s - loss: 0.8210 - regression_loss: 0.7116 - classification_loss: 0.1094 332/500 [==================>...........] - ETA: 55s - loss: 0.8201 - regression_loss: 0.7108 - classification_loss: 0.1093 333/500 [==================>...........] - ETA: 54s - loss: 0.8202 - regression_loss: 0.7109 - classification_loss: 0.1094 334/500 [===================>..........] 
- ETA: 54s - loss: 0.8201 - regression_loss: 0.7107 - classification_loss: 0.1093 335/500 [===================>..........] - ETA: 54s - loss: 0.8217 - regression_loss: 0.7120 - classification_loss: 0.1097 336/500 [===================>..........] - ETA: 53s - loss: 0.8242 - regression_loss: 0.7140 - classification_loss: 0.1102 337/500 [===================>..........] - ETA: 53s - loss: 0.8243 - regression_loss: 0.7141 - classification_loss: 0.1101 338/500 [===================>..........] - ETA: 53s - loss: 0.8250 - regression_loss: 0.7149 - classification_loss: 0.1101 339/500 [===================>..........] - ETA: 52s - loss: 0.8251 - regression_loss: 0.7150 - classification_loss: 0.1101 340/500 [===================>..........] - ETA: 52s - loss: 0.8251 - regression_loss: 0.7149 - classification_loss: 0.1102 341/500 [===================>..........] - ETA: 52s - loss: 0.8244 - regression_loss: 0.7143 - classification_loss: 0.1101 342/500 [===================>..........] - ETA: 51s - loss: 0.8242 - regression_loss: 0.7142 - classification_loss: 0.1100 343/500 [===================>..........] - ETA: 51s - loss: 0.8225 - regression_loss: 0.7127 - classification_loss: 0.1098 344/500 [===================>..........] - ETA: 51s - loss: 0.8234 - regression_loss: 0.7135 - classification_loss: 0.1099 345/500 [===================>..........] - ETA: 50s - loss: 0.8232 - regression_loss: 0.7132 - classification_loss: 0.1100 346/500 [===================>..........] - ETA: 50s - loss: 0.8237 - regression_loss: 0.7137 - classification_loss: 0.1101 347/500 [===================>..........] - ETA: 50s - loss: 0.8224 - regression_loss: 0.7125 - classification_loss: 0.1099 348/500 [===================>..........] - ETA: 50s - loss: 0.8233 - regression_loss: 0.7132 - classification_loss: 0.1100 349/500 [===================>..........] - ETA: 49s - loss: 0.8251 - regression_loss: 0.7147 - classification_loss: 0.1104 350/500 [====================>.........] 
- ETA: 49s - loss: 0.8267 - regression_loss: 0.7160 - classification_loss: 0.1107 351/500 [====================>.........] - ETA: 49s - loss: 0.8265 - regression_loss: 0.7158 - classification_loss: 0.1106 352/500 [====================>.........] - ETA: 48s - loss: 0.8261 - regression_loss: 0.7156 - classification_loss: 0.1105 353/500 [====================>.........] - ETA: 48s - loss: 0.8268 - regression_loss: 0.7162 - classification_loss: 0.1106 354/500 [====================>.........] - ETA: 48s - loss: 0.8277 - regression_loss: 0.7169 - classification_loss: 0.1107 355/500 [====================>.........] - ETA: 47s - loss: 0.8281 - regression_loss: 0.7172 - classification_loss: 0.1109 356/500 [====================>.........] - ETA: 47s - loss: 0.8274 - regression_loss: 0.7166 - classification_loss: 0.1108 357/500 [====================>.........] - ETA: 47s - loss: 0.8282 - regression_loss: 0.7173 - classification_loss: 0.1109 358/500 [====================>.........] - ETA: 46s - loss: 0.8295 - regression_loss: 0.7184 - classification_loss: 0.1111 359/500 [====================>.........] - ETA: 46s - loss: 0.8298 - regression_loss: 0.7187 - classification_loss: 0.1111 360/500 [====================>.........] - ETA: 46s - loss: 0.8293 - regression_loss: 0.7183 - classification_loss: 0.1110 361/500 [====================>.........] - ETA: 45s - loss: 0.8285 - regression_loss: 0.7177 - classification_loss: 0.1108 362/500 [====================>.........] - ETA: 45s - loss: 0.8272 - regression_loss: 0.7166 - classification_loss: 0.1106 363/500 [====================>.........] - ETA: 45s - loss: 0.8270 - regression_loss: 0.7164 - classification_loss: 0.1106 364/500 [====================>.........] - ETA: 44s - loss: 0.8269 - regression_loss: 0.7162 - classification_loss: 0.1106 365/500 [====================>.........] - ETA: 44s - loss: 0.8271 - regression_loss: 0.7165 - classification_loss: 0.1105 366/500 [====================>.........] 
- ETA: 44s - loss: 0.8294 - regression_loss: 0.7184 - classification_loss: 0.1110 367/500 [=====================>........] - ETA: 43s - loss: 0.8296 - regression_loss: 0.7186 - classification_loss: 0.1110 368/500 [=====================>........] - ETA: 43s - loss: 0.8293 - regression_loss: 0.7184 - classification_loss: 0.1110 369/500 [=====================>........] - ETA: 43s - loss: 0.8277 - regression_loss: 0.7169 - classification_loss: 0.1108 370/500 [=====================>........] - ETA: 42s - loss: 0.8280 - regression_loss: 0.7172 - classification_loss: 0.1108 371/500 [=====================>........] - ETA: 42s - loss: 0.8280 - regression_loss: 0.7172 - classification_loss: 0.1108 372/500 [=====================>........] - ETA: 42s - loss: 0.8288 - regression_loss: 0.7180 - classification_loss: 0.1109 373/500 [=====================>........] - ETA: 41s - loss: 0.8275 - regression_loss: 0.7169 - classification_loss: 0.1107 374/500 [=====================>........] - ETA: 41s - loss: 0.8270 - regression_loss: 0.7164 - classification_loss: 0.1106 375/500 [=====================>........] - ETA: 41s - loss: 0.8271 - regression_loss: 0.7164 - classification_loss: 0.1106 376/500 [=====================>........] - ETA: 40s - loss: 0.8277 - regression_loss: 0.7170 - classification_loss: 0.1107 377/500 [=====================>........] - ETA: 40s - loss: 0.8264 - regression_loss: 0.7159 - classification_loss: 0.1105 378/500 [=====================>........] - ETA: 40s - loss: 0.8276 - regression_loss: 0.7169 - classification_loss: 0.1107 379/500 [=====================>........] - ETA: 39s - loss: 0.8275 - regression_loss: 0.7169 - classification_loss: 0.1106 380/500 [=====================>........] - ETA: 39s - loss: 0.8285 - regression_loss: 0.7177 - classification_loss: 0.1108 381/500 [=====================>........] - ETA: 39s - loss: 0.8287 - regression_loss: 0.7178 - classification_loss: 0.1109 382/500 [=====================>........] 
[per-step progress output for steps 383-494 of epoch 37 elided; the running loss stayed near 0.82 (regression ~0.71, classification ~0.11)]
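The progress updates above all follow the same fixed Keras format. A minimal Python sketch for recovering the per-step loss values from such a log, e.g. to plot loss curves (the `parse_progress` helper and the regex are illustrative, not part of keras-retinanet):

```python
import re

# Each Keras progress update has the shape:
#   "400/500 [====>...] - ETA: 32s - loss: 0.8282 - regression_loss: 0.7170 - classification_loss: 0.1111"
# The lazy ".*?" skips the bar and ETA and stops at the first "loss:" field.
PATTERN = re.compile(
    r"(\d+)/\d+ \[[=>.]*\].*?"
    r"loss: ([\d.]+) - regression_loss: ([\d.]+) - classification_loss: ([\d.]+)"
)

def parse_progress(text):
    """Yield (step, loss, regression_loss, classification_loss) per update."""
    for m in PATTERN.finditer(text):
        step, loss, reg, cls = m.groups()
        yield int(step), float(loss), float(reg), float(cls)

sample = ("400/500 [=======================>......] - ETA: 32s - loss: 0.8282 "
          "- regression_loss: 0.7170 - classification_loss: 0.1111")
print(list(parse_progress(sample)))
```

Because the updates were written with carriage returns, many land on one physical line after capture; `finditer` handles that, returning one tuple per update regardless of line breaks.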
500/500 [==============================] - 164s 329ms/step - loss: 0.8179 - regression_loss: 0.7082 - classification_loss: 0.1097
1172 instances of class plum with average precision: 0.6313
mAP: 0.6313
Epoch 00037: saving model to ./training/snapshots/resnet101_pascal_37.h5
Epoch 38/150
1/500 [..............................] - ETA: 2:40 - loss: 1.2629 - regression_loss: 1.0622 - classification_loss: 0.2007
[per-step progress output for steps 2-217 of epoch 38 elided; the running loss settled near 0.77 (regression ~0.66, classification ~0.10)]
- ETA: 1:33 - loss: 0.7702 - regression_loss: 0.6674 - classification_loss: 0.1027 218/500 [============>.................] - ETA: 1:33 - loss: 0.7687 - regression_loss: 0.6661 - classification_loss: 0.1025 219/500 [============>.................] - ETA: 1:32 - loss: 0.7679 - regression_loss: 0.6655 - classification_loss: 0.1024 220/500 [============>.................] - ETA: 1:32 - loss: 0.7695 - regression_loss: 0.6669 - classification_loss: 0.1025 221/500 [============>.................] - ETA: 1:32 - loss: 0.7713 - regression_loss: 0.6688 - classification_loss: 0.1026 222/500 [============>.................] - ETA: 1:31 - loss: 0.7701 - regression_loss: 0.6678 - classification_loss: 0.1023 223/500 [============>.................] - ETA: 1:31 - loss: 0.7708 - regression_loss: 0.6685 - classification_loss: 0.1024 224/500 [============>.................] - ETA: 1:31 - loss: 0.7686 - regression_loss: 0.6665 - classification_loss: 0.1021 225/500 [============>.................] - ETA: 1:30 - loss: 0.7684 - regression_loss: 0.6664 - classification_loss: 0.1021 226/500 [============>.................] - ETA: 1:30 - loss: 0.7678 - regression_loss: 0.6659 - classification_loss: 0.1019 227/500 [============>.................] - ETA: 1:30 - loss: 0.7690 - regression_loss: 0.6671 - classification_loss: 0.1019 228/500 [============>.................] - ETA: 1:29 - loss: 0.7725 - regression_loss: 0.6697 - classification_loss: 0.1028 229/500 [============>.................] - ETA: 1:29 - loss: 0.7733 - regression_loss: 0.6702 - classification_loss: 0.1031 230/500 [============>.................] - ETA: 1:29 - loss: 0.7746 - regression_loss: 0.6713 - classification_loss: 0.1034 231/500 [============>.................] - ETA: 1:28 - loss: 0.7761 - regression_loss: 0.6725 - classification_loss: 0.1036 232/500 [============>.................] - ETA: 1:28 - loss: 0.7772 - regression_loss: 0.6732 - classification_loss: 0.1040 233/500 [============>.................] 
- ETA: 1:28 - loss: 0.7784 - regression_loss: 0.6743 - classification_loss: 0.1041 234/500 [=============>................] - ETA: 1:27 - loss: 0.7797 - regression_loss: 0.6755 - classification_loss: 0.1042 235/500 [=============>................] - ETA: 1:27 - loss: 0.7782 - regression_loss: 0.6741 - classification_loss: 0.1041 236/500 [=============>................] - ETA: 1:27 - loss: 0.7797 - regression_loss: 0.6753 - classification_loss: 0.1043 237/500 [=============>................] - ETA: 1:26 - loss: 0.7804 - regression_loss: 0.6760 - classification_loss: 0.1044 238/500 [=============>................] - ETA: 1:26 - loss: 0.7804 - regression_loss: 0.6760 - classification_loss: 0.1044 239/500 [=============>................] - ETA: 1:26 - loss: 0.7805 - regression_loss: 0.6762 - classification_loss: 0.1043 240/500 [=============>................] - ETA: 1:25 - loss: 0.7818 - regression_loss: 0.6773 - classification_loss: 0.1045 241/500 [=============>................] - ETA: 1:25 - loss: 0.7812 - regression_loss: 0.6769 - classification_loss: 0.1043 242/500 [=============>................] - ETA: 1:25 - loss: 0.7809 - regression_loss: 0.6766 - classification_loss: 0.1043 243/500 [=============>................] - ETA: 1:24 - loss: 0.7800 - regression_loss: 0.6760 - classification_loss: 0.1040 244/500 [=============>................] - ETA: 1:24 - loss: 0.7803 - regression_loss: 0.6762 - classification_loss: 0.1040 245/500 [=============>................] - ETA: 1:24 - loss: 0.7810 - regression_loss: 0.6771 - classification_loss: 0.1039 246/500 [=============>................] - ETA: 1:23 - loss: 0.7802 - regression_loss: 0.6764 - classification_loss: 0.1038 247/500 [=============>................] - ETA: 1:23 - loss: 0.7839 - regression_loss: 0.6796 - classification_loss: 0.1043 248/500 [=============>................] - ETA: 1:23 - loss: 0.7838 - regression_loss: 0.6793 - classification_loss: 0.1045 249/500 [=============>................] 
- ETA: 1:22 - loss: 0.7826 - regression_loss: 0.6783 - classification_loss: 0.1042 250/500 [==============>...............] - ETA: 1:22 - loss: 0.7816 - regression_loss: 0.6776 - classification_loss: 0.1040 251/500 [==============>...............] - ETA: 1:22 - loss: 0.7824 - regression_loss: 0.6783 - classification_loss: 0.1041 252/500 [==============>...............] - ETA: 1:21 - loss: 0.7830 - regression_loss: 0.6788 - classification_loss: 0.1042 253/500 [==============>...............] - ETA: 1:21 - loss: 0.7842 - regression_loss: 0.6799 - classification_loss: 0.1043 254/500 [==============>...............] - ETA: 1:21 - loss: 0.7855 - regression_loss: 0.6810 - classification_loss: 0.1045 255/500 [==============>...............] - ETA: 1:20 - loss: 0.7874 - regression_loss: 0.6826 - classification_loss: 0.1048 256/500 [==============>...............] - ETA: 1:20 - loss: 0.7894 - regression_loss: 0.6843 - classification_loss: 0.1051 257/500 [==============>...............] - ETA: 1:20 - loss: 0.7888 - regression_loss: 0.6839 - classification_loss: 0.1049 258/500 [==============>...............] - ETA: 1:19 - loss: 0.7874 - regression_loss: 0.6828 - classification_loss: 0.1047 259/500 [==============>...............] - ETA: 1:19 - loss: 0.7893 - regression_loss: 0.6844 - classification_loss: 0.1048 260/500 [==============>...............] - ETA: 1:19 - loss: 0.7900 - regression_loss: 0.6850 - classification_loss: 0.1049 261/500 [==============>...............] - ETA: 1:18 - loss: 0.7915 - regression_loss: 0.6863 - classification_loss: 0.1051 262/500 [==============>...............] - ETA: 1:18 - loss: 0.7905 - regression_loss: 0.6856 - classification_loss: 0.1049 263/500 [==============>...............] - ETA: 1:18 - loss: 0.7886 - regression_loss: 0.6840 - classification_loss: 0.1046 264/500 [==============>...............] - ETA: 1:17 - loss: 0.7889 - regression_loss: 0.6843 - classification_loss: 0.1046 265/500 [==============>...............] 
- ETA: 1:17 - loss: 0.7904 - regression_loss: 0.6855 - classification_loss: 0.1049 266/500 [==============>...............] - ETA: 1:17 - loss: 0.7893 - regression_loss: 0.6846 - classification_loss: 0.1047 267/500 [===============>..............] - ETA: 1:16 - loss: 0.7878 - regression_loss: 0.6834 - classification_loss: 0.1044 268/500 [===============>..............] - ETA: 1:16 - loss: 0.7902 - regression_loss: 0.6854 - classification_loss: 0.1049 269/500 [===============>..............] - ETA: 1:16 - loss: 0.7912 - regression_loss: 0.6864 - classification_loss: 0.1048 270/500 [===============>..............] - ETA: 1:15 - loss: 0.7907 - regression_loss: 0.6859 - classification_loss: 0.1048 271/500 [===============>..............] - ETA: 1:15 - loss: 0.7904 - regression_loss: 0.6856 - classification_loss: 0.1048 272/500 [===============>..............] - ETA: 1:15 - loss: 0.7903 - regression_loss: 0.6855 - classification_loss: 0.1047 273/500 [===============>..............] - ETA: 1:14 - loss: 0.7888 - regression_loss: 0.6842 - classification_loss: 0.1045 274/500 [===============>..............] - ETA: 1:14 - loss: 0.7888 - regression_loss: 0.6843 - classification_loss: 0.1045 275/500 [===============>..............] - ETA: 1:14 - loss: 0.7899 - regression_loss: 0.6852 - classification_loss: 0.1046 276/500 [===============>..............] - ETA: 1:13 - loss: 0.7901 - regression_loss: 0.6855 - classification_loss: 0.1046 277/500 [===============>..............] - ETA: 1:13 - loss: 0.7911 - regression_loss: 0.6867 - classification_loss: 0.1044 278/500 [===============>..............] - ETA: 1:13 - loss: 0.7932 - regression_loss: 0.6883 - classification_loss: 0.1048 279/500 [===============>..............] - ETA: 1:13 - loss: 0.7955 - regression_loss: 0.6904 - classification_loss: 0.1051 280/500 [===============>..............] - ETA: 1:12 - loss: 0.7954 - regression_loss: 0.6902 - classification_loss: 0.1052 281/500 [===============>..............] 
- ETA: 1:12 - loss: 0.7974 - regression_loss: 0.6919 - classification_loss: 0.1055 282/500 [===============>..............] - ETA: 1:12 - loss: 0.7975 - regression_loss: 0.6921 - classification_loss: 0.1054 283/500 [===============>..............] - ETA: 1:11 - loss: 0.7974 - regression_loss: 0.6920 - classification_loss: 0.1054 284/500 [================>.............] - ETA: 1:11 - loss: 0.7989 - regression_loss: 0.6932 - classification_loss: 0.1057 285/500 [================>.............] - ETA: 1:11 - loss: 0.7985 - regression_loss: 0.6928 - classification_loss: 0.1057 286/500 [================>.............] - ETA: 1:10 - loss: 0.7972 - regression_loss: 0.6917 - classification_loss: 0.1055 287/500 [================>.............] - ETA: 1:10 - loss: 0.7966 - regression_loss: 0.6911 - classification_loss: 0.1055 288/500 [================>.............] - ETA: 1:10 - loss: 0.7968 - regression_loss: 0.6913 - classification_loss: 0.1055 289/500 [================>.............] - ETA: 1:09 - loss: 0.7983 - regression_loss: 0.6925 - classification_loss: 0.1058 290/500 [================>.............] - ETA: 1:09 - loss: 0.8005 - regression_loss: 0.6942 - classification_loss: 0.1062 291/500 [================>.............] - ETA: 1:09 - loss: 0.7983 - regression_loss: 0.6923 - classification_loss: 0.1060 292/500 [================>.............] - ETA: 1:08 - loss: 0.7966 - regression_loss: 0.6908 - classification_loss: 0.1058 293/500 [================>.............] - ETA: 1:08 - loss: 0.7971 - regression_loss: 0.6911 - classification_loss: 0.1060 294/500 [================>.............] - ETA: 1:08 - loss: 0.7978 - regression_loss: 0.6918 - classification_loss: 0.1061 295/500 [================>.............] - ETA: 1:07 - loss: 0.7965 - regression_loss: 0.6907 - classification_loss: 0.1058 296/500 [================>.............] - ETA: 1:07 - loss: 0.7957 - regression_loss: 0.6900 - classification_loss: 0.1057 297/500 [================>.............] 
- ETA: 1:07 - loss: 0.7955 - regression_loss: 0.6897 - classification_loss: 0.1058 298/500 [================>.............] - ETA: 1:06 - loss: 0.7966 - regression_loss: 0.6906 - classification_loss: 0.1059 299/500 [================>.............] - ETA: 1:06 - loss: 0.7968 - regression_loss: 0.6907 - classification_loss: 0.1061 300/500 [=================>............] - ETA: 1:06 - loss: 0.7993 - regression_loss: 0.6928 - classification_loss: 0.1065 301/500 [=================>............] - ETA: 1:05 - loss: 0.7996 - regression_loss: 0.6931 - classification_loss: 0.1065 302/500 [=================>............] - ETA: 1:05 - loss: 0.7991 - regression_loss: 0.6927 - classification_loss: 0.1063 303/500 [=================>............] - ETA: 1:05 - loss: 0.7991 - regression_loss: 0.6928 - classification_loss: 0.1063 304/500 [=================>............] - ETA: 1:04 - loss: 0.8010 - regression_loss: 0.6944 - classification_loss: 0.1066 305/500 [=================>............] - ETA: 1:04 - loss: 0.8002 - regression_loss: 0.6938 - classification_loss: 0.1064 306/500 [=================>............] - ETA: 1:04 - loss: 0.8011 - regression_loss: 0.6945 - classification_loss: 0.1066 307/500 [=================>............] - ETA: 1:03 - loss: 0.7994 - regression_loss: 0.6930 - classification_loss: 0.1064 308/500 [=================>............] - ETA: 1:03 - loss: 0.8004 - regression_loss: 0.6938 - classification_loss: 0.1067 309/500 [=================>............] - ETA: 1:03 - loss: 0.8007 - regression_loss: 0.6940 - classification_loss: 0.1067 310/500 [=================>............] - ETA: 1:02 - loss: 0.7998 - regression_loss: 0.6932 - classification_loss: 0.1066 311/500 [=================>............] - ETA: 1:02 - loss: 0.8005 - regression_loss: 0.6938 - classification_loss: 0.1067 312/500 [=================>............] - ETA: 1:02 - loss: 0.8005 - regression_loss: 0.6938 - classification_loss: 0.1066 313/500 [=================>............] 
- ETA: 1:01 - loss: 0.8016 - regression_loss: 0.6948 - classification_loss: 0.1068 314/500 [=================>............] - ETA: 1:01 - loss: 0.8015 - regression_loss: 0.6947 - classification_loss: 0.1068 315/500 [=================>............] - ETA: 1:01 - loss: 0.8021 - regression_loss: 0.6953 - classification_loss: 0.1068 316/500 [=================>............] - ETA: 1:00 - loss: 0.8020 - regression_loss: 0.6954 - classification_loss: 0.1067 317/500 [==================>...........] - ETA: 1:00 - loss: 0.8013 - regression_loss: 0.6948 - classification_loss: 0.1065 318/500 [==================>...........] - ETA: 1:00 - loss: 0.8019 - regression_loss: 0.6953 - classification_loss: 0.1066 319/500 [==================>...........] - ETA: 59s - loss: 0.8009 - regression_loss: 0.6945 - classification_loss: 0.1065  320/500 [==================>...........] - ETA: 59s - loss: 0.8005 - regression_loss: 0.6941 - classification_loss: 0.1064 321/500 [==================>...........] - ETA: 59s - loss: 0.7997 - regression_loss: 0.6934 - classification_loss: 0.1063 322/500 [==================>...........] - ETA: 58s - loss: 0.7999 - regression_loss: 0.6935 - classification_loss: 0.1063 323/500 [==================>...........] - ETA: 58s - loss: 0.7991 - regression_loss: 0.6930 - classification_loss: 0.1062 324/500 [==================>...........] - ETA: 58s - loss: 0.7986 - regression_loss: 0.6925 - classification_loss: 0.1061 325/500 [==================>...........] - ETA: 57s - loss: 0.7981 - regression_loss: 0.6921 - classification_loss: 0.1061 326/500 [==================>...........] - ETA: 57s - loss: 0.7985 - regression_loss: 0.6924 - classification_loss: 0.1061 327/500 [==================>...........] - ETA: 57s - loss: 0.7998 - regression_loss: 0.6935 - classification_loss: 0.1063 328/500 [==================>...........] - ETA: 56s - loss: 0.8004 - regression_loss: 0.6940 - classification_loss: 0.1063 329/500 [==================>...........] 
- ETA: 56s - loss: 0.7995 - regression_loss: 0.6934 - classification_loss: 0.1062 330/500 [==================>...........] - ETA: 56s - loss: 0.7999 - regression_loss: 0.6937 - classification_loss: 0.1062 331/500 [==================>...........] - ETA: 55s - loss: 0.8002 - regression_loss: 0.6940 - classification_loss: 0.1062 332/500 [==================>...........] - ETA: 55s - loss: 0.7988 - regression_loss: 0.6927 - classification_loss: 0.1061 333/500 [==================>...........] - ETA: 55s - loss: 0.7996 - regression_loss: 0.6934 - classification_loss: 0.1062 334/500 [===================>..........] - ETA: 54s - loss: 0.8001 - regression_loss: 0.6939 - classification_loss: 0.1062 335/500 [===================>..........] - ETA: 54s - loss: 0.8006 - regression_loss: 0.6945 - classification_loss: 0.1062 336/500 [===================>..........] - ETA: 54s - loss: 0.8016 - regression_loss: 0.6952 - classification_loss: 0.1063 337/500 [===================>..........] - ETA: 53s - loss: 0.8011 - regression_loss: 0.6949 - classification_loss: 0.1062 338/500 [===================>..........] - ETA: 53s - loss: 0.8002 - regression_loss: 0.6941 - classification_loss: 0.1061 339/500 [===================>..........] - ETA: 53s - loss: 0.8009 - regression_loss: 0.6948 - classification_loss: 0.1062 340/500 [===================>..........] - ETA: 52s - loss: 0.8023 - regression_loss: 0.6961 - classification_loss: 0.1062 341/500 [===================>..........] - ETA: 52s - loss: 0.8015 - regression_loss: 0.6954 - classification_loss: 0.1061 342/500 [===================>..........] - ETA: 52s - loss: 0.8002 - regression_loss: 0.6943 - classification_loss: 0.1059 343/500 [===================>..........] - ETA: 51s - loss: 0.7985 - regression_loss: 0.6928 - classification_loss: 0.1057 344/500 [===================>..........] - ETA: 51s - loss: 0.7973 - regression_loss: 0.6918 - classification_loss: 0.1055 345/500 [===================>..........] 
- ETA: 51s - loss: 0.7974 - regression_loss: 0.6918 - classification_loss: 0.1056 346/500 [===================>..........] - ETA: 50s - loss: 0.7974 - regression_loss: 0.6919 - classification_loss: 0.1055 347/500 [===================>..........] - ETA: 50s - loss: 0.7961 - regression_loss: 0.6908 - classification_loss: 0.1053 348/500 [===================>..........] - ETA: 50s - loss: 0.7963 - regression_loss: 0.6910 - classification_loss: 0.1053 349/500 [===================>..........] - ETA: 49s - loss: 0.7968 - regression_loss: 0.6914 - classification_loss: 0.1054 350/500 [====================>.........] - ETA: 49s - loss: 0.7976 - regression_loss: 0.6922 - classification_loss: 0.1054 351/500 [====================>.........] - ETA: 49s - loss: 0.7970 - regression_loss: 0.6916 - classification_loss: 0.1054 352/500 [====================>.........] - ETA: 48s - loss: 0.7976 - regression_loss: 0.6921 - classification_loss: 0.1054 353/500 [====================>.........] - ETA: 48s - loss: 0.7963 - regression_loss: 0.6911 - classification_loss: 0.1052 354/500 [====================>.........] - ETA: 48s - loss: 0.7963 - regression_loss: 0.6911 - classification_loss: 0.1052 355/500 [====================>.........] - ETA: 47s - loss: 0.7969 - regression_loss: 0.6916 - classification_loss: 0.1053 356/500 [====================>.........] - ETA: 47s - loss: 0.7966 - regression_loss: 0.6914 - classification_loss: 0.1053 357/500 [====================>.........] - ETA: 47s - loss: 0.7960 - regression_loss: 0.6908 - classification_loss: 0.1052 358/500 [====================>.........] - ETA: 46s - loss: 0.7963 - regression_loss: 0.6910 - classification_loss: 0.1053 359/500 [====================>.........] - ETA: 46s - loss: 0.7973 - regression_loss: 0.6919 - classification_loss: 0.1054 360/500 [====================>.........] - ETA: 46s - loss: 0.7964 - regression_loss: 0.6912 - classification_loss: 0.1052 361/500 [====================>.........] 
- ETA: 45s - loss: 0.7961 - regression_loss: 0.6909 - classification_loss: 0.1052 362/500 [====================>.........] - ETA: 45s - loss: 0.7968 - regression_loss: 0.6915 - classification_loss: 0.1053 363/500 [====================>.........] - ETA: 45s - loss: 0.7975 - regression_loss: 0.6920 - classification_loss: 0.1054 364/500 [====================>.........] - ETA: 44s - loss: 0.7979 - regression_loss: 0.6924 - classification_loss: 0.1055 365/500 [====================>.........] - ETA: 44s - loss: 0.7985 - regression_loss: 0.6929 - classification_loss: 0.1056 366/500 [====================>.........] - ETA: 44s - loss: 0.7989 - regression_loss: 0.6933 - classification_loss: 0.1056 367/500 [=====================>........] - ETA: 43s - loss: 0.7990 - regression_loss: 0.6934 - classification_loss: 0.1056 368/500 [=====================>........] - ETA: 43s - loss: 0.7991 - regression_loss: 0.6934 - classification_loss: 0.1056 369/500 [=====================>........] - ETA: 43s - loss: 0.7988 - regression_loss: 0.6931 - classification_loss: 0.1057 370/500 [=====================>........] - ETA: 42s - loss: 0.7987 - regression_loss: 0.6931 - classification_loss: 0.1056 371/500 [=====================>........] - ETA: 42s - loss: 0.7985 - regression_loss: 0.6930 - classification_loss: 0.1055 372/500 [=====================>........] - ETA: 42s - loss: 0.7988 - regression_loss: 0.6933 - classification_loss: 0.1055 373/500 [=====================>........] - ETA: 42s - loss: 0.7987 - regression_loss: 0.6933 - classification_loss: 0.1054 374/500 [=====================>........] - ETA: 41s - loss: 0.7998 - regression_loss: 0.6941 - classification_loss: 0.1057 375/500 [=====================>........] - ETA: 41s - loss: 0.8001 - regression_loss: 0.6944 - classification_loss: 0.1058 376/500 [=====================>........] - ETA: 41s - loss: 0.8018 - regression_loss: 0.6958 - classification_loss: 0.1060 377/500 [=====================>........] 
- ETA: 40s - loss: 0.8029 - regression_loss: 0.6966 - classification_loss: 0.1062 378/500 [=====================>........] - ETA: 40s - loss: 0.8017 - regression_loss: 0.6956 - classification_loss: 0.1061 379/500 [=====================>........] - ETA: 40s - loss: 0.8015 - regression_loss: 0.6955 - classification_loss: 0.1060 380/500 [=====================>........] - ETA: 39s - loss: 0.8016 - regression_loss: 0.6957 - classification_loss: 0.1059 381/500 [=====================>........] - ETA: 39s - loss: 0.8012 - regression_loss: 0.6953 - classification_loss: 0.1059 382/500 [=====================>........] - ETA: 39s - loss: 0.8015 - regression_loss: 0.6954 - classification_loss: 0.1060 383/500 [=====================>........] - ETA: 38s - loss: 0.8011 - regression_loss: 0.6951 - classification_loss: 0.1060 384/500 [======================>.......] - ETA: 38s - loss: 0.8004 - regression_loss: 0.6945 - classification_loss: 0.1059 385/500 [======================>.......] - ETA: 38s - loss: 0.7998 - regression_loss: 0.6941 - classification_loss: 0.1057 386/500 [======================>.......] - ETA: 37s - loss: 0.7991 - regression_loss: 0.6935 - classification_loss: 0.1056 387/500 [======================>.......] - ETA: 37s - loss: 0.7983 - regression_loss: 0.6929 - classification_loss: 0.1054 388/500 [======================>.......] - ETA: 37s - loss: 0.7994 - regression_loss: 0.6940 - classification_loss: 0.1054 389/500 [======================>.......] - ETA: 36s - loss: 0.7990 - regression_loss: 0.6937 - classification_loss: 0.1053 390/500 [======================>.......] - ETA: 36s - loss: 0.7977 - regression_loss: 0.6927 - classification_loss: 0.1051 391/500 [======================>.......] - ETA: 36s - loss: 0.7975 - regression_loss: 0.6925 - classification_loss: 0.1050 392/500 [======================>.......] - ETA: 35s - loss: 0.7976 - regression_loss: 0.6925 - classification_loss: 0.1050 393/500 [======================>.......] 
- ETA: 35s - loss: 0.7975 - regression_loss: 0.6925 - classification_loss: 0.1049 394/500 [======================>.......] - ETA: 35s - loss: 0.7974 - regression_loss: 0.6925 - classification_loss: 0.1050 395/500 [======================>.......] - ETA: 34s - loss: 0.7963 - regression_loss: 0.6915 - classification_loss: 0.1048 396/500 [======================>.......] - ETA: 34s - loss: 0.7970 - regression_loss: 0.6921 - classification_loss: 0.1049 397/500 [======================>.......] - ETA: 34s - loss: 0.7969 - regression_loss: 0.6920 - classification_loss: 0.1050 398/500 [======================>.......] - ETA: 33s - loss: 0.7972 - regression_loss: 0.6923 - classification_loss: 0.1049 399/500 [======================>.......] - ETA: 33s - loss: 0.7973 - regression_loss: 0.6925 - classification_loss: 0.1049 400/500 [=======================>......] - ETA: 33s - loss: 0.7970 - regression_loss: 0.6921 - classification_loss: 0.1048 401/500 [=======================>......] - ETA: 32s - loss: 0.7974 - regression_loss: 0.6924 - classification_loss: 0.1050 402/500 [=======================>......] - ETA: 32s - loss: 0.7978 - regression_loss: 0.6928 - classification_loss: 0.1050 403/500 [=======================>......] - ETA: 32s - loss: 0.7984 - regression_loss: 0.6933 - classification_loss: 0.1051 404/500 [=======================>......] - ETA: 31s - loss: 0.7990 - regression_loss: 0.6937 - classification_loss: 0.1053 405/500 [=======================>......] - ETA: 31s - loss: 0.7980 - regression_loss: 0.6928 - classification_loss: 0.1052 406/500 [=======================>......] - ETA: 31s - loss: 0.7995 - regression_loss: 0.6940 - classification_loss: 0.1054 407/500 [=======================>......] - ETA: 30s - loss: 0.7987 - regression_loss: 0.6934 - classification_loss: 0.1053 408/500 [=======================>......] - ETA: 30s - loss: 0.7990 - regression_loss: 0.6937 - classification_loss: 0.1053 409/500 [=======================>......] 
- ETA: 30s - loss: 0.8000 - regression_loss: 0.6946 - classification_loss: 0.1054 410/500 [=======================>......] - ETA: 29s - loss: 0.7991 - regression_loss: 0.6939 - classification_loss: 0.1052 411/500 [=======================>......] - ETA: 29s - loss: 0.7985 - regression_loss: 0.6934 - classification_loss: 0.1051 412/500 [=======================>......] - ETA: 29s - loss: 0.7980 - regression_loss: 0.6930 - classification_loss: 0.1050 413/500 [=======================>......] - ETA: 28s - loss: 0.7980 - regression_loss: 0.6931 - classification_loss: 0.1050 414/500 [=======================>......] - ETA: 28s - loss: 0.7969 - regression_loss: 0.6921 - classification_loss: 0.1048 415/500 [=======================>......] - ETA: 28s - loss: 0.7965 - regression_loss: 0.6918 - classification_loss: 0.1047 416/500 [=======================>......] - ETA: 27s - loss: 0.7968 - regression_loss: 0.6920 - classification_loss: 0.1048 417/500 [========================>.....] - ETA: 27s - loss: 0.7967 - regression_loss: 0.6920 - classification_loss: 0.1047 418/500 [========================>.....] - ETA: 27s - loss: 0.7969 - regression_loss: 0.6922 - classification_loss: 0.1047 419/500 [========================>.....] - ETA: 26s - loss: 0.7974 - regression_loss: 0.6925 - classification_loss: 0.1049 420/500 [========================>.....] - ETA: 26s - loss: 0.7963 - regression_loss: 0.6915 - classification_loss: 0.1048 421/500 [========================>.....] - ETA: 26s - loss: 0.7965 - regression_loss: 0.6916 - classification_loss: 0.1049 422/500 [========================>.....] - ETA: 25s - loss: 0.7958 - regression_loss: 0.6911 - classification_loss: 0.1047 423/500 [========================>.....] - ETA: 25s - loss: 0.7958 - regression_loss: 0.6912 - classification_loss: 0.1046 424/500 [========================>.....] - ETA: 25s - loss: 0.7954 - regression_loss: 0.6908 - classification_loss: 0.1045 425/500 [========================>.....] 
500/500 [==============================] - 165s 331ms/step - loss: 0.7970 - regression_loss: 0.6922 - classification_loss: 0.1047
1172 instances of class plum with average precision: 0.6252
mAP: 0.6252
Epoch 00038: saving model to ./training/snapshots/resnet101_pascal_38.h5
Epoch 39/150
- ETA: 1:19 - loss: 0.7888 - regression_loss: 0.6848 - classification_loss: 0.1040 261/500 [==============>...............] - ETA: 1:18 - loss: 0.7880 - regression_loss: 0.6842 - classification_loss: 0.1038 262/500 [==============>...............] - ETA: 1:18 - loss: 0.7902 - regression_loss: 0.6860 - classification_loss: 0.1042 263/500 [==============>...............] - ETA: 1:18 - loss: 0.7890 - regression_loss: 0.6850 - classification_loss: 0.1040 264/500 [==============>...............] - ETA: 1:17 - loss: 0.7910 - regression_loss: 0.6865 - classification_loss: 0.1044 265/500 [==============>...............] - ETA: 1:17 - loss: 0.7897 - regression_loss: 0.6855 - classification_loss: 0.1042 266/500 [==============>...............] - ETA: 1:17 - loss: 0.7881 - regression_loss: 0.6842 - classification_loss: 0.1040 267/500 [===============>..............] - ETA: 1:16 - loss: 0.7892 - regression_loss: 0.6851 - classification_loss: 0.1041 268/500 [===============>..............] - ETA: 1:16 - loss: 0.7909 - regression_loss: 0.6864 - classification_loss: 0.1045 269/500 [===============>..............] - ETA: 1:16 - loss: 0.7906 - regression_loss: 0.6861 - classification_loss: 0.1044 270/500 [===============>..............] - ETA: 1:15 - loss: 0.7909 - regression_loss: 0.6863 - classification_loss: 0.1046 271/500 [===============>..............] - ETA: 1:15 - loss: 0.7905 - regression_loss: 0.6858 - classification_loss: 0.1046 272/500 [===============>..............] - ETA: 1:15 - loss: 0.7891 - regression_loss: 0.6847 - classification_loss: 0.1044 273/500 [===============>..............] - ETA: 1:14 - loss: 0.7886 - regression_loss: 0.6843 - classification_loss: 0.1043 274/500 [===============>..............] - ETA: 1:14 - loss: 0.7889 - regression_loss: 0.6846 - classification_loss: 0.1043 275/500 [===============>..............] - ETA: 1:14 - loss: 0.7903 - regression_loss: 0.6856 - classification_loss: 0.1046 276/500 [===============>..............] 
- ETA: 1:13 - loss: 0.7906 - regression_loss: 0.6860 - classification_loss: 0.1047 277/500 [===============>..............] - ETA: 1:13 - loss: 0.7894 - regression_loss: 0.6849 - classification_loss: 0.1045 278/500 [===============>..............] - ETA: 1:13 - loss: 0.7896 - regression_loss: 0.6851 - classification_loss: 0.1045 279/500 [===============>..............] - ETA: 1:12 - loss: 0.7898 - regression_loss: 0.6851 - classification_loss: 0.1047 280/500 [===============>..............] - ETA: 1:12 - loss: 0.7916 - regression_loss: 0.6864 - classification_loss: 0.1052 281/500 [===============>..............] - ETA: 1:12 - loss: 0.7920 - regression_loss: 0.6867 - classification_loss: 0.1053 282/500 [===============>..............] - ETA: 1:11 - loss: 0.7919 - regression_loss: 0.6867 - classification_loss: 0.1053 283/500 [===============>..............] - ETA: 1:11 - loss: 0.7908 - regression_loss: 0.6855 - classification_loss: 0.1052 284/500 [================>.............] - ETA: 1:11 - loss: 0.7891 - regression_loss: 0.6841 - classification_loss: 0.1050 285/500 [================>.............] - ETA: 1:10 - loss: 0.7897 - regression_loss: 0.6847 - classification_loss: 0.1050 286/500 [================>.............] - ETA: 1:10 - loss: 0.7904 - regression_loss: 0.6853 - classification_loss: 0.1051 287/500 [================>.............] - ETA: 1:10 - loss: 0.7892 - regression_loss: 0.6843 - classification_loss: 0.1049 288/500 [================>.............] - ETA: 1:10 - loss: 0.7904 - regression_loss: 0.6853 - classification_loss: 0.1051 289/500 [================>.............] - ETA: 1:09 - loss: 0.7909 - regression_loss: 0.6857 - classification_loss: 0.1052 290/500 [================>.............] - ETA: 1:09 - loss: 0.7890 - regression_loss: 0.6841 - classification_loss: 0.1049 291/500 [================>.............] - ETA: 1:08 - loss: 0.7889 - regression_loss: 0.6839 - classification_loss: 0.1050 292/500 [================>.............] 
- ETA: 1:08 - loss: 0.7888 - regression_loss: 0.6840 - classification_loss: 0.1048 293/500 [================>.............] - ETA: 1:08 - loss: 0.7883 - regression_loss: 0.6836 - classification_loss: 0.1047 294/500 [================>.............] - ETA: 1:08 - loss: 0.7874 - regression_loss: 0.6829 - classification_loss: 0.1045 295/500 [================>.............] - ETA: 1:07 - loss: 0.7869 - regression_loss: 0.6825 - classification_loss: 0.1044 296/500 [================>.............] - ETA: 1:07 - loss: 0.7862 - regression_loss: 0.6819 - classification_loss: 0.1043 297/500 [================>.............] - ETA: 1:07 - loss: 0.7865 - regression_loss: 0.6821 - classification_loss: 0.1043 298/500 [================>.............] - ETA: 1:06 - loss: 0.7866 - regression_loss: 0.6823 - classification_loss: 0.1043 299/500 [================>.............] - ETA: 1:06 - loss: 0.7868 - regression_loss: 0.6826 - classification_loss: 0.1043 300/500 [=================>............] - ETA: 1:06 - loss: 0.7860 - regression_loss: 0.6819 - classification_loss: 0.1041 301/500 [=================>............] - ETA: 1:05 - loss: 0.7871 - regression_loss: 0.6829 - classification_loss: 0.1042 302/500 [=================>............] - ETA: 1:05 - loss: 0.7862 - regression_loss: 0.6820 - classification_loss: 0.1042 303/500 [=================>............] - ETA: 1:05 - loss: 0.7864 - regression_loss: 0.6822 - classification_loss: 0.1042 304/500 [=================>............] - ETA: 1:04 - loss: 0.7859 - regression_loss: 0.6818 - classification_loss: 0.1041 305/500 [=================>............] - ETA: 1:04 - loss: 0.7863 - regression_loss: 0.6820 - classification_loss: 0.1043 306/500 [=================>............] - ETA: 1:04 - loss: 0.7859 - regression_loss: 0.6816 - classification_loss: 0.1042 307/500 [=================>............] - ETA: 1:03 - loss: 0.7866 - regression_loss: 0.6822 - classification_loss: 0.1044 308/500 [=================>............] 
- ETA: 1:03 - loss: 0.7887 - regression_loss: 0.6839 - classification_loss: 0.1048 309/500 [=================>............] - ETA: 1:03 - loss: 0.7883 - regression_loss: 0.6836 - classification_loss: 0.1047 310/500 [=================>............] - ETA: 1:02 - loss: 0.7887 - regression_loss: 0.6839 - classification_loss: 0.1047 311/500 [=================>............] - ETA: 1:02 - loss: 0.7888 - regression_loss: 0.6840 - classification_loss: 0.1048 312/500 [=================>............] - ETA: 1:02 - loss: 0.7885 - regression_loss: 0.6836 - classification_loss: 0.1048 313/500 [=================>............] - ETA: 1:01 - loss: 0.7881 - regression_loss: 0.6834 - classification_loss: 0.1047 314/500 [=================>............] - ETA: 1:01 - loss: 0.7865 - regression_loss: 0.6819 - classification_loss: 0.1045 315/500 [=================>............] - ETA: 1:01 - loss: 0.7874 - regression_loss: 0.6827 - classification_loss: 0.1047 316/500 [=================>............] - ETA: 1:00 - loss: 0.7881 - regression_loss: 0.6834 - classification_loss: 0.1047 317/500 [==================>...........] - ETA: 1:00 - loss: 0.7887 - regression_loss: 0.6837 - classification_loss: 0.1049 318/500 [==================>...........] - ETA: 1:00 - loss: 0.7877 - regression_loss: 0.6830 - classification_loss: 0.1048 319/500 [==================>...........] - ETA: 59s - loss: 0.7881 - regression_loss: 0.6833 - classification_loss: 0.1048  320/500 [==================>...........] - ETA: 59s - loss: 0.7867 - regression_loss: 0.6821 - classification_loss: 0.1046 321/500 [==================>...........] - ETA: 59s - loss: 0.7861 - regression_loss: 0.6815 - classification_loss: 0.1045 322/500 [==================>...........] - ETA: 58s - loss: 0.7881 - regression_loss: 0.6833 - classification_loss: 0.1048 323/500 [==================>...........] - ETA: 58s - loss: 0.7878 - regression_loss: 0.6831 - classification_loss: 0.1047 324/500 [==================>...........] 
- ETA: 58s - loss: 0.7883 - regression_loss: 0.6835 - classification_loss: 0.1047 325/500 [==================>...........] - ETA: 57s - loss: 0.7880 - regression_loss: 0.6832 - classification_loss: 0.1048 326/500 [==================>...........] - ETA: 57s - loss: 0.7880 - regression_loss: 0.6832 - classification_loss: 0.1048 327/500 [==================>...........] - ETA: 57s - loss: 0.7880 - regression_loss: 0.6832 - classification_loss: 0.1048 328/500 [==================>...........] - ETA: 56s - loss: 0.7884 - regression_loss: 0.6835 - classification_loss: 0.1049 329/500 [==================>...........] - ETA: 56s - loss: 0.7885 - regression_loss: 0.6836 - classification_loss: 0.1050 330/500 [==================>...........] - ETA: 56s - loss: 0.7892 - regression_loss: 0.6842 - classification_loss: 0.1050 331/500 [==================>...........] - ETA: 55s - loss: 0.7892 - regression_loss: 0.6843 - classification_loss: 0.1049 332/500 [==================>...........] - ETA: 55s - loss: 0.7881 - regression_loss: 0.6833 - classification_loss: 0.1048 333/500 [==================>...........] - ETA: 55s - loss: 0.7878 - regression_loss: 0.6830 - classification_loss: 0.1048 334/500 [===================>..........] - ETA: 54s - loss: 0.7894 - regression_loss: 0.6844 - classification_loss: 0.1050 335/500 [===================>..........] - ETA: 54s - loss: 0.7890 - regression_loss: 0.6841 - classification_loss: 0.1049 336/500 [===================>..........] - ETA: 54s - loss: 0.7896 - regression_loss: 0.6848 - classification_loss: 0.1048 337/500 [===================>..........] - ETA: 53s - loss: 0.7901 - regression_loss: 0.6852 - classification_loss: 0.1049 338/500 [===================>..........] - ETA: 53s - loss: 0.7913 - regression_loss: 0.6862 - classification_loss: 0.1051 339/500 [===================>..........] - ETA: 53s - loss: 0.7900 - regression_loss: 0.6851 - classification_loss: 0.1049 340/500 [===================>..........] 
- ETA: 52s - loss: 0.7904 - regression_loss: 0.6854 - classification_loss: 0.1050 341/500 [===================>..........] - ETA: 52s - loss: 0.7910 - regression_loss: 0.6859 - classification_loss: 0.1051 342/500 [===================>..........] - ETA: 52s - loss: 0.7909 - regression_loss: 0.6858 - classification_loss: 0.1050 343/500 [===================>..........] - ETA: 51s - loss: 0.7918 - regression_loss: 0.6866 - classification_loss: 0.1052 344/500 [===================>..........] - ETA: 51s - loss: 0.7904 - regression_loss: 0.6854 - classification_loss: 0.1050 345/500 [===================>..........] - ETA: 51s - loss: 0.7894 - regression_loss: 0.6845 - classification_loss: 0.1048 346/500 [===================>..........] - ETA: 50s - loss: 0.7888 - regression_loss: 0.6842 - classification_loss: 0.1046 347/500 [===================>..........] - ETA: 50s - loss: 0.7900 - regression_loss: 0.6853 - classification_loss: 0.1048 348/500 [===================>..........] - ETA: 50s - loss: 0.7916 - regression_loss: 0.6866 - classification_loss: 0.1049 349/500 [===================>..........] - ETA: 49s - loss: 0.7920 - regression_loss: 0.6870 - classification_loss: 0.1050 350/500 [====================>.........] - ETA: 49s - loss: 0.7928 - regression_loss: 0.6878 - classification_loss: 0.1051 351/500 [====================>.........] - ETA: 49s - loss: 0.7917 - regression_loss: 0.6868 - classification_loss: 0.1049 352/500 [====================>.........] - ETA: 48s - loss: 0.7913 - regression_loss: 0.6864 - classification_loss: 0.1048 353/500 [====================>.........] - ETA: 48s - loss: 0.7918 - regression_loss: 0.6870 - classification_loss: 0.1049 354/500 [====================>.........] - ETA: 48s - loss: 0.7913 - regression_loss: 0.6866 - classification_loss: 0.1047 355/500 [====================>.........] - ETA: 47s - loss: 0.7903 - regression_loss: 0.6857 - classification_loss: 0.1046 356/500 [====================>.........] 
- ETA: 47s - loss: 0.7897 - regression_loss: 0.6852 - classification_loss: 0.1045 357/500 [====================>.........] - ETA: 47s - loss: 0.7890 - regression_loss: 0.6847 - classification_loss: 0.1044 358/500 [====================>.........] - ETA: 46s - loss: 0.7906 - regression_loss: 0.6860 - classification_loss: 0.1046 359/500 [====================>.........] - ETA: 46s - loss: 0.7915 - regression_loss: 0.6868 - classification_loss: 0.1047 360/500 [====================>.........] - ETA: 46s - loss: 0.7900 - regression_loss: 0.6854 - classification_loss: 0.1046 361/500 [====================>.........] - ETA: 45s - loss: 0.7921 - regression_loss: 0.6872 - classification_loss: 0.1049 362/500 [====================>.........] - ETA: 45s - loss: 0.7906 - regression_loss: 0.6859 - classification_loss: 0.1047 363/500 [====================>.........] - ETA: 45s - loss: 0.7895 - regression_loss: 0.6849 - classification_loss: 0.1045 364/500 [====================>.........] - ETA: 44s - loss: 0.7899 - regression_loss: 0.6853 - classification_loss: 0.1046 365/500 [====================>.........] - ETA: 44s - loss: 0.7912 - regression_loss: 0.6863 - classification_loss: 0.1049 366/500 [====================>.........] - ETA: 44s - loss: 0.7903 - regression_loss: 0.6855 - classification_loss: 0.1048 367/500 [=====================>........] - ETA: 43s - loss: 0.7900 - regression_loss: 0.6854 - classification_loss: 0.1047 368/500 [=====================>........] - ETA: 43s - loss: 0.7894 - regression_loss: 0.6849 - classification_loss: 0.1046 369/500 [=====================>........] - ETA: 43s - loss: 0.7892 - regression_loss: 0.6847 - classification_loss: 0.1045 370/500 [=====================>........] - ETA: 42s - loss: 0.7901 - regression_loss: 0.6854 - classification_loss: 0.1046 371/500 [=====================>........] - ETA: 42s - loss: 0.7890 - regression_loss: 0.6845 - classification_loss: 0.1045 372/500 [=====================>........] 
- ETA: 42s - loss: 0.7894 - regression_loss: 0.6848 - classification_loss: 0.1045 373/500 [=====================>........] - ETA: 41s - loss: 0.7890 - regression_loss: 0.6845 - classification_loss: 0.1045 374/500 [=====================>........] - ETA: 41s - loss: 0.7917 - regression_loss: 0.6869 - classification_loss: 0.1048 375/500 [=====================>........] - ETA: 41s - loss: 0.7904 - regression_loss: 0.6858 - classification_loss: 0.1047 376/500 [=====================>........] - ETA: 40s - loss: 0.7905 - regression_loss: 0.6859 - classification_loss: 0.1047 377/500 [=====================>........] - ETA: 40s - loss: 0.7901 - regression_loss: 0.6855 - classification_loss: 0.1046 378/500 [=====================>........] - ETA: 40s - loss: 0.7910 - regression_loss: 0.6862 - classification_loss: 0.1049 379/500 [=====================>........] - ETA: 40s - loss: 0.7913 - regression_loss: 0.6864 - classification_loss: 0.1049 380/500 [=====================>........] - ETA: 39s - loss: 0.7907 - regression_loss: 0.6859 - classification_loss: 0.1048 381/500 [=====================>........] - ETA: 39s - loss: 0.7914 - regression_loss: 0.6864 - classification_loss: 0.1050 382/500 [=====================>........] - ETA: 39s - loss: 0.7922 - regression_loss: 0.6871 - classification_loss: 0.1051 383/500 [=====================>........] - ETA: 38s - loss: 0.7919 - regression_loss: 0.6868 - classification_loss: 0.1051 384/500 [======================>.......] - ETA: 38s - loss: 0.7914 - regression_loss: 0.6865 - classification_loss: 0.1050 385/500 [======================>.......] - ETA: 38s - loss: 0.7910 - regression_loss: 0.6861 - classification_loss: 0.1049 386/500 [======================>.......] - ETA: 37s - loss: 0.7916 - regression_loss: 0.6866 - classification_loss: 0.1050 387/500 [======================>.......] - ETA: 37s - loss: 0.7915 - regression_loss: 0.6864 - classification_loss: 0.1051 388/500 [======================>.......] 
- ETA: 37s - loss: 0.7915 - regression_loss: 0.6863 - classification_loss: 0.1052 389/500 [======================>.......] - ETA: 36s - loss: 0.7913 - regression_loss: 0.6861 - classification_loss: 0.1052 390/500 [======================>.......] - ETA: 36s - loss: 0.7913 - regression_loss: 0.6861 - classification_loss: 0.1053 391/500 [======================>.......] - ETA: 36s - loss: 0.7917 - regression_loss: 0.6864 - classification_loss: 0.1053 392/500 [======================>.......] - ETA: 35s - loss: 0.7909 - regression_loss: 0.6857 - classification_loss: 0.1052 393/500 [======================>.......] - ETA: 35s - loss: 0.7929 - regression_loss: 0.6874 - classification_loss: 0.1055 394/500 [======================>.......] - ETA: 35s - loss: 0.7940 - regression_loss: 0.6883 - classification_loss: 0.1057 395/500 [======================>.......] - ETA: 34s - loss: 0.7940 - regression_loss: 0.6883 - classification_loss: 0.1056 396/500 [======================>.......] - ETA: 34s - loss: 0.7935 - regression_loss: 0.6880 - classification_loss: 0.1055 397/500 [======================>.......] - ETA: 34s - loss: 0.7930 - regression_loss: 0.6876 - classification_loss: 0.1054 398/500 [======================>.......] - ETA: 33s - loss: 0.7937 - regression_loss: 0.6882 - classification_loss: 0.1055 399/500 [======================>.......] - ETA: 33s - loss: 0.7926 - regression_loss: 0.6873 - classification_loss: 0.1053 400/500 [=======================>......] - ETA: 33s - loss: 0.7934 - regression_loss: 0.6880 - classification_loss: 0.1054 401/500 [=======================>......] - ETA: 32s - loss: 0.7926 - regression_loss: 0.6873 - classification_loss: 0.1053 402/500 [=======================>......] - ETA: 32s - loss: 0.7913 - regression_loss: 0.6862 - classification_loss: 0.1051 403/500 [=======================>......] - ETA: 32s - loss: 0.7928 - regression_loss: 0.6876 - classification_loss: 0.1052 404/500 [=======================>......] 
- ETA: 31s - loss: 0.7942 - regression_loss: 0.6889 - classification_loss: 0.1053 405/500 [=======================>......] - ETA: 31s - loss: 0.7960 - regression_loss: 0.6904 - classification_loss: 0.1056 406/500 [=======================>......] - ETA: 31s - loss: 0.7954 - regression_loss: 0.6899 - classification_loss: 0.1055 407/500 [=======================>......] - ETA: 30s - loss: 0.7964 - regression_loss: 0.6907 - classification_loss: 0.1057 408/500 [=======================>......] - ETA: 30s - loss: 0.7958 - regression_loss: 0.6901 - classification_loss: 0.1057 409/500 [=======================>......] - ETA: 30s - loss: 0.7963 - regression_loss: 0.6905 - classification_loss: 0.1058 410/500 [=======================>......] - ETA: 29s - loss: 0.7956 - regression_loss: 0.6900 - classification_loss: 0.1056 411/500 [=======================>......] - ETA: 29s - loss: 0.7956 - regression_loss: 0.6900 - classification_loss: 0.1056 412/500 [=======================>......] - ETA: 29s - loss: 0.7956 - regression_loss: 0.6898 - classification_loss: 0.1058 413/500 [=======================>......] - ETA: 28s - loss: 0.7967 - regression_loss: 0.6905 - classification_loss: 0.1061 414/500 [=======================>......] - ETA: 28s - loss: 0.7973 - regression_loss: 0.6910 - classification_loss: 0.1063 415/500 [=======================>......] - ETA: 28s - loss: 0.7976 - regression_loss: 0.6912 - classification_loss: 0.1064 416/500 [=======================>......] - ETA: 27s - loss: 0.7975 - regression_loss: 0.6912 - classification_loss: 0.1063 417/500 [========================>.....] - ETA: 27s - loss: 0.7977 - regression_loss: 0.6914 - classification_loss: 0.1063 418/500 [========================>.....] - ETA: 27s - loss: 0.7972 - regression_loss: 0.6910 - classification_loss: 0.1062 419/500 [========================>.....] - ETA: 26s - loss: 0.7981 - regression_loss: 0.6918 - classification_loss: 0.1062 420/500 [========================>.....] 
- ETA: 26s - loss: 0.7986 - regression_loss: 0.6923 - classification_loss: 0.1063 421/500 [========================>.....] - ETA: 26s - loss: 0.7986 - regression_loss: 0.6923 - classification_loss: 0.1063 422/500 [========================>.....] - ETA: 25s - loss: 0.7992 - regression_loss: 0.6930 - classification_loss: 0.1062 423/500 [========================>.....] - ETA: 25s - loss: 0.8000 - regression_loss: 0.6937 - classification_loss: 0.1063 424/500 [========================>.....] - ETA: 25s - loss: 0.8002 - regression_loss: 0.6939 - classification_loss: 0.1063 425/500 [========================>.....] - ETA: 24s - loss: 0.8000 - regression_loss: 0.6938 - classification_loss: 0.1062 426/500 [========================>.....] - ETA: 24s - loss: 0.7993 - regression_loss: 0.6931 - classification_loss: 0.1061 427/500 [========================>.....] - ETA: 24s - loss: 0.7990 - regression_loss: 0.6929 - classification_loss: 0.1061 428/500 [========================>.....] - ETA: 23s - loss: 0.7987 - regression_loss: 0.6927 - classification_loss: 0.1061 429/500 [========================>.....] - ETA: 23s - loss: 0.7996 - regression_loss: 0.6933 - classification_loss: 0.1063 430/500 [========================>.....] - ETA: 23s - loss: 0.8002 - regression_loss: 0.6937 - classification_loss: 0.1064 431/500 [========================>.....] - ETA: 22s - loss: 0.8006 - regression_loss: 0.6940 - classification_loss: 0.1065 432/500 [========================>.....] - ETA: 22s - loss: 0.8004 - regression_loss: 0.6939 - classification_loss: 0.1065 433/500 [========================>.....] - ETA: 22s - loss: 0.8015 - regression_loss: 0.6948 - classification_loss: 0.1067 434/500 [=========================>....] - ETA: 21s - loss: 0.8025 - regression_loss: 0.6958 - classification_loss: 0.1067 435/500 [=========================>....] - ETA: 21s - loss: 0.8017 - regression_loss: 0.6951 - classification_loss: 0.1066 436/500 [=========================>....] 
- ETA: 21s - loss: 0.8005 - regression_loss: 0.6941 - classification_loss: 0.1064 437/500 [=========================>....] - ETA: 20s - loss: 0.8006 - regression_loss: 0.6942 - classification_loss: 0.1064 438/500 [=========================>....] - ETA: 20s - loss: 0.8014 - regression_loss: 0.6949 - classification_loss: 0.1065 439/500 [=========================>....] - ETA: 20s - loss: 0.8013 - regression_loss: 0.6949 - classification_loss: 0.1064 440/500 [=========================>....] - ETA: 19s - loss: 0.8017 - regression_loss: 0.6953 - classification_loss: 0.1065 441/500 [=========================>....] - ETA: 19s - loss: 0.8019 - regression_loss: 0.6953 - classification_loss: 0.1066 442/500 [=========================>....] - ETA: 19s - loss: 0.8021 - regression_loss: 0.6954 - classification_loss: 0.1067 443/500 [=========================>....] - ETA: 18s - loss: 0.8022 - regression_loss: 0.6955 - classification_loss: 0.1067 444/500 [=========================>....] - ETA: 18s - loss: 0.8012 - regression_loss: 0.6946 - classification_loss: 0.1066 445/500 [=========================>....] - ETA: 18s - loss: 0.8003 - regression_loss: 0.6939 - classification_loss: 0.1064 446/500 [=========================>....] - ETA: 17s - loss: 0.7993 - regression_loss: 0.6930 - classification_loss: 0.1063 447/500 [=========================>....] - ETA: 17s - loss: 0.7984 - regression_loss: 0.6922 - classification_loss: 0.1062 448/500 [=========================>....] - ETA: 17s - loss: 0.7977 - regression_loss: 0.6916 - classification_loss: 0.1061 449/500 [=========================>....] - ETA: 16s - loss: 0.7974 - regression_loss: 0.6913 - classification_loss: 0.1061 450/500 [==========================>...] - ETA: 16s - loss: 0.7970 - regression_loss: 0.6910 - classification_loss: 0.1060 451/500 [==========================>...] - ETA: 16s - loss: 0.7977 - regression_loss: 0.6917 - classification_loss: 0.1061 452/500 [==========================>...] 
- ETA: 15s - loss: 0.7968 - regression_loss: 0.6908 - classification_loss: 0.1059 453/500 [==========================>...] - ETA: 15s - loss: 0.7977 - regression_loss: 0.6916 - classification_loss: 0.1060 454/500 [==========================>...] - ETA: 15s - loss: 0.7978 - regression_loss: 0.6918 - classification_loss: 0.1060 455/500 [==========================>...] - ETA: 14s - loss: 0.7980 - regression_loss: 0.6920 - classification_loss: 0.1060 456/500 [==========================>...] - ETA: 14s - loss: 0.7981 - regression_loss: 0.6921 - classification_loss: 0.1060 457/500 [==========================>...] - ETA: 14s - loss: 0.7994 - regression_loss: 0.6932 - classification_loss: 0.1062 458/500 [==========================>...] - ETA: 13s - loss: 0.7987 - regression_loss: 0.6925 - classification_loss: 0.1061 459/500 [==========================>...] - ETA: 13s - loss: 0.7994 - regression_loss: 0.6932 - classification_loss: 0.1062 460/500 [==========================>...] - ETA: 13s - loss: 0.7998 - regression_loss: 0.6936 - classification_loss: 0.1062 461/500 [==========================>...] - ETA: 12s - loss: 0.8006 - regression_loss: 0.6942 - classification_loss: 0.1063 462/500 [==========================>...] - ETA: 12s - loss: 0.8010 - regression_loss: 0.6948 - classification_loss: 0.1062 463/500 [==========================>...] - ETA: 12s - loss: 0.8012 - regression_loss: 0.6949 - classification_loss: 0.1063 464/500 [==========================>...] - ETA: 11s - loss: 0.8005 - regression_loss: 0.6944 - classification_loss: 0.1061 465/500 [==========================>...] - ETA: 11s - loss: 0.7999 - regression_loss: 0.6939 - classification_loss: 0.1060 466/500 [==========================>...] - ETA: 11s - loss: 0.8007 - regression_loss: 0.6946 - classification_loss: 0.1062 467/500 [===========================>..] - ETA: 10s - loss: 0.8016 - regression_loss: 0.6954 - classification_loss: 0.1063 468/500 [===========================>..] 
- ETA: 10s - loss: 0.8017 - regression_loss: 0.6954 - classification_loss: 0.1063 469/500 [===========================>..] - ETA: 10s - loss: 0.8008 - regression_loss: 0.6947 - classification_loss: 0.1061 470/500 [===========================>..] - ETA: 9s - loss: 0.8009 - regression_loss: 0.6948 - classification_loss: 0.1061  471/500 [===========================>..] - ETA: 9s - loss: 0.8001 - regression_loss: 0.6941 - classification_loss: 0.1060 472/500 [===========================>..] - ETA: 9s - loss: 0.7992 - regression_loss: 0.6933 - classification_loss: 0.1059 473/500 [===========================>..] - ETA: 8s - loss: 0.8002 - regression_loss: 0.6942 - classification_loss: 0.1060 474/500 [===========================>..] - ETA: 8s - loss: 0.8014 - regression_loss: 0.6952 - classification_loss: 0.1062 475/500 [===========================>..] - ETA: 8s - loss: 0.8018 - regression_loss: 0.6954 - classification_loss: 0.1064 476/500 [===========================>..] - ETA: 7s - loss: 0.8016 - regression_loss: 0.6954 - classification_loss: 0.1062 477/500 [===========================>..] - ETA: 7s - loss: 0.8013 - regression_loss: 0.6951 - classification_loss: 0.1062 478/500 [===========================>..] - ETA: 7s - loss: 0.8020 - regression_loss: 0.6957 - classification_loss: 0.1063 479/500 [===========================>..] - ETA: 6s - loss: 0.8016 - regression_loss: 0.6953 - classification_loss: 0.1063 480/500 [===========================>..] - ETA: 6s - loss: 0.8013 - regression_loss: 0.6950 - classification_loss: 0.1063 481/500 [===========================>..] - ETA: 6s - loss: 0.8014 - regression_loss: 0.6952 - classification_loss: 0.1062 482/500 [===========================>..] - ETA: 5s - loss: 0.8009 - regression_loss: 0.6948 - classification_loss: 0.1060 483/500 [===========================>..] - ETA: 5s - loss: 0.8001 - regression_loss: 0.6941 - classification_loss: 0.1060 484/500 [============================>.] 
- ETA: 5s - loss: 0.8004 - regression_loss: 0.6943 - classification_loss: 0.1060 485/500 [============================>.] - ETA: 4s - loss: 0.8011 - regression_loss: 0.6950 - classification_loss: 0.1061 486/500 [============================>.] - ETA: 4s - loss: 0.8003 - regression_loss: 0.6943 - classification_loss: 0.1060 487/500 [============================>.] - ETA: 4s - loss: 0.8000 - regression_loss: 0.6941 - classification_loss: 0.1059 488/500 [============================>.] - ETA: 3s - loss: 0.8006 - regression_loss: 0.6946 - classification_loss: 0.1060 489/500 [============================>.] - ETA: 3s - loss: 0.8007 - regression_loss: 0.6947 - classification_loss: 0.1060 490/500 [============================>.] - ETA: 3s - loss: 0.8000 - regression_loss: 0.6941 - classification_loss: 0.1059 491/500 [============================>.] - ETA: 2s - loss: 0.8000 - regression_loss: 0.6942 - classification_loss: 0.1058 492/500 [============================>.] - ETA: 2s - loss: 0.8009 - regression_loss: 0.6950 - classification_loss: 0.1060 493/500 [============================>.] - ETA: 2s - loss: 0.8001 - regression_loss: 0.6943 - classification_loss: 0.1059 494/500 [============================>.] - ETA: 1s - loss: 0.8011 - regression_loss: 0.6953 - classification_loss: 0.1058 495/500 [============================>.] - ETA: 1s - loss: 0.8007 - regression_loss: 0.6949 - classification_loss: 0.1058 496/500 [============================>.] - ETA: 1s - loss: 0.8003 - regression_loss: 0.6945 - classification_loss: 0.1057 497/500 [============================>.] - ETA: 0s - loss: 0.8008 - regression_loss: 0.6949 - classification_loss: 0.1059 498/500 [============================>.] - ETA: 0s - loss: 0.8014 - regression_loss: 0.6955 - classification_loss: 0.1059 499/500 [============================>.] 
500/500 [==============================] - 166s 331ms/step - loss: 0.8004 - regression_loss: 0.6946 - classification_loss: 0.1058
1172 instances of class plum with average precision: 0.6272
mAP: 0.6272
Epoch 00039: saving model to ./training/snapshots/resnet101_pascal_39.h5
Epoch 40/150
1/500 [..............................] - ETA: 2:44 - loss: 1.1591 - regression_loss: 1.0451 - classification_loss: 0.1140
...
301/500 [=================>............] - ETA: 1:05 - loss: 0.7780 - regression_loss: 0.6776 - classification_loss: 0.1004 302/500 [=================>............] 
- ETA: 1:05 - loss: 0.7779 - regression_loss: 0.6776 - classification_loss: 0.1003 303/500 [=================>............] - ETA: 1:05 - loss: 0.7784 - regression_loss: 0.6780 - classification_loss: 0.1004 304/500 [=================>............] - ETA: 1:04 - loss: 0.7773 - regression_loss: 0.6770 - classification_loss: 0.1002 305/500 [=================>............] - ETA: 1:04 - loss: 0.7773 - regression_loss: 0.6770 - classification_loss: 0.1003 306/500 [=================>............] - ETA: 1:04 - loss: 0.7790 - regression_loss: 0.6785 - classification_loss: 0.1005 307/500 [=================>............] - ETA: 1:03 - loss: 0.7773 - regression_loss: 0.6770 - classification_loss: 0.1003 308/500 [=================>............] - ETA: 1:03 - loss: 0.7792 - regression_loss: 0.6785 - classification_loss: 0.1007 309/500 [=================>............] - ETA: 1:03 - loss: 0.7780 - regression_loss: 0.6775 - classification_loss: 0.1006 310/500 [=================>............] - ETA: 1:02 - loss: 0.7765 - regression_loss: 0.6761 - classification_loss: 0.1004 311/500 [=================>............] - ETA: 1:02 - loss: 0.7768 - regression_loss: 0.6764 - classification_loss: 0.1004 312/500 [=================>............] - ETA: 1:02 - loss: 0.7767 - regression_loss: 0.6763 - classification_loss: 0.1005 313/500 [=================>............] - ETA: 1:01 - loss: 0.7776 - regression_loss: 0.6770 - classification_loss: 0.1005 314/500 [=================>............] - ETA: 1:01 - loss: 0.7764 - regression_loss: 0.6760 - classification_loss: 0.1003 315/500 [=================>............] - ETA: 1:01 - loss: 0.7768 - regression_loss: 0.6764 - classification_loss: 0.1004 316/500 [=================>............] - ETA: 1:00 - loss: 0.7773 - regression_loss: 0.6769 - classification_loss: 0.1004 317/500 [==================>...........] - ETA: 1:00 - loss: 0.7760 - regression_loss: 0.6758 - classification_loss: 0.1002 318/500 [==================>...........] 
- ETA: 1:00 - loss: 0.7767 - regression_loss: 0.6766 - classification_loss: 0.1001 319/500 [==================>...........] - ETA: 59s - loss: 0.7767 - regression_loss: 0.6765 - classification_loss: 0.1002  320/500 [==================>...........] - ETA: 59s - loss: 0.7756 - regression_loss: 0.6755 - classification_loss: 0.1001 321/500 [==================>...........] - ETA: 59s - loss: 0.7761 - regression_loss: 0.6761 - classification_loss: 0.1000 322/500 [==================>...........] - ETA: 58s - loss: 0.7769 - regression_loss: 0.6767 - classification_loss: 0.1002 323/500 [==================>...........] - ETA: 58s - loss: 0.7757 - regression_loss: 0.6756 - classification_loss: 0.1001 324/500 [==================>...........] - ETA: 58s - loss: 0.7766 - regression_loss: 0.6764 - classification_loss: 0.1002 325/500 [==================>...........] - ETA: 57s - loss: 0.7762 - regression_loss: 0.6760 - classification_loss: 0.1002 326/500 [==================>...........] - ETA: 57s - loss: 0.7762 - regression_loss: 0.6760 - classification_loss: 0.1002 327/500 [==================>...........] - ETA: 57s - loss: 0.7755 - regression_loss: 0.6754 - classification_loss: 0.1002 328/500 [==================>...........] - ETA: 56s - loss: 0.7751 - regression_loss: 0.6750 - classification_loss: 0.1002 329/500 [==================>...........] - ETA: 56s - loss: 0.7750 - regression_loss: 0.6750 - classification_loss: 0.1000 330/500 [==================>...........] - ETA: 56s - loss: 0.7741 - regression_loss: 0.6742 - classification_loss: 0.0999 331/500 [==================>...........] - ETA: 55s - loss: 0.7743 - regression_loss: 0.6744 - classification_loss: 0.0999 332/500 [==================>...........] - ETA: 55s - loss: 0.7726 - regression_loss: 0.6729 - classification_loss: 0.0997 333/500 [==================>...........] - ETA: 55s - loss: 0.7733 - regression_loss: 0.6736 - classification_loss: 0.0997 334/500 [===================>..........] 
- ETA: 54s - loss: 0.7739 - regression_loss: 0.6742 - classification_loss: 0.0997 335/500 [===================>..........] - ETA: 54s - loss: 0.7723 - regression_loss: 0.6728 - classification_loss: 0.0995 336/500 [===================>..........] - ETA: 54s - loss: 0.7710 - regression_loss: 0.6717 - classification_loss: 0.0993 337/500 [===================>..........] - ETA: 53s - loss: 0.7698 - regression_loss: 0.6707 - classification_loss: 0.0991 338/500 [===================>..........] - ETA: 53s - loss: 0.7691 - regression_loss: 0.6701 - classification_loss: 0.0989 339/500 [===================>..........] - ETA: 53s - loss: 0.7703 - regression_loss: 0.6709 - classification_loss: 0.0994 340/500 [===================>..........] - ETA: 52s - loss: 0.7715 - regression_loss: 0.6718 - classification_loss: 0.0998 341/500 [===================>..........] - ETA: 52s - loss: 0.7714 - regression_loss: 0.6717 - classification_loss: 0.0997 342/500 [===================>..........] - ETA: 52s - loss: 0.7737 - regression_loss: 0.6735 - classification_loss: 0.1002 343/500 [===================>..........] - ETA: 51s - loss: 0.7741 - regression_loss: 0.6737 - classification_loss: 0.1003 344/500 [===================>..........] - ETA: 51s - loss: 0.7746 - regression_loss: 0.6742 - classification_loss: 0.1004 345/500 [===================>..........] - ETA: 51s - loss: 0.7759 - regression_loss: 0.6753 - classification_loss: 0.1006 346/500 [===================>..........] - ETA: 50s - loss: 0.7758 - regression_loss: 0.6753 - classification_loss: 0.1006 347/500 [===================>..........] - ETA: 50s - loss: 0.7762 - regression_loss: 0.6755 - classification_loss: 0.1007 348/500 [===================>..........] - ETA: 50s - loss: 0.7764 - regression_loss: 0.6756 - classification_loss: 0.1008 349/500 [===================>..........] - ETA: 49s - loss: 0.7767 - regression_loss: 0.6759 - classification_loss: 0.1008 350/500 [====================>.........] 
- ETA: 49s - loss: 0.7768 - regression_loss: 0.6760 - classification_loss: 0.1009 351/500 [====================>.........] - ETA: 49s - loss: 0.7768 - regression_loss: 0.6760 - classification_loss: 0.1008 352/500 [====================>.........] - ETA: 48s - loss: 0.7757 - regression_loss: 0.6750 - classification_loss: 0.1007 353/500 [====================>.........] - ETA: 48s - loss: 0.7753 - regression_loss: 0.6747 - classification_loss: 0.1006 354/500 [====================>.........] - ETA: 48s - loss: 0.7760 - regression_loss: 0.6752 - classification_loss: 0.1008 355/500 [====================>.........] - ETA: 47s - loss: 0.7780 - regression_loss: 0.6769 - classification_loss: 0.1011 356/500 [====================>.........] - ETA: 47s - loss: 0.7785 - regression_loss: 0.6773 - classification_loss: 0.1012 357/500 [====================>.........] - ETA: 47s - loss: 0.7792 - regression_loss: 0.6779 - classification_loss: 0.1013 358/500 [====================>.........] - ETA: 46s - loss: 0.7803 - regression_loss: 0.6789 - classification_loss: 0.1015 359/500 [====================>.........] - ETA: 46s - loss: 0.7788 - regression_loss: 0.6775 - classification_loss: 0.1013 360/500 [====================>.........] - ETA: 46s - loss: 0.7777 - regression_loss: 0.6766 - classification_loss: 0.1011 361/500 [====================>.........] - ETA: 45s - loss: 0.7770 - regression_loss: 0.6759 - classification_loss: 0.1011 362/500 [====================>.........] - ETA: 45s - loss: 0.7767 - regression_loss: 0.6756 - classification_loss: 0.1010 363/500 [====================>.........] - ETA: 45s - loss: 0.7758 - regression_loss: 0.6748 - classification_loss: 0.1010 364/500 [====================>.........] - ETA: 44s - loss: 0.7763 - regression_loss: 0.6753 - classification_loss: 0.1010 365/500 [====================>.........] - ETA: 44s - loss: 0.7780 - regression_loss: 0.6768 - classification_loss: 0.1012 366/500 [====================>.........] 
- ETA: 44s - loss: 0.7766 - regression_loss: 0.6756 - classification_loss: 0.1010 367/500 [=====================>........] - ETA: 43s - loss: 0.7773 - regression_loss: 0.6761 - classification_loss: 0.1011 368/500 [=====================>........] - ETA: 43s - loss: 0.7786 - regression_loss: 0.6772 - classification_loss: 0.1014 369/500 [=====================>........] - ETA: 43s - loss: 0.7789 - regression_loss: 0.6775 - classification_loss: 0.1015 370/500 [=====================>........] - ETA: 42s - loss: 0.7785 - regression_loss: 0.6771 - classification_loss: 0.1014 371/500 [=====================>........] - ETA: 42s - loss: 0.7779 - regression_loss: 0.6766 - classification_loss: 0.1013 372/500 [=====================>........] - ETA: 42s - loss: 0.7781 - regression_loss: 0.6769 - classification_loss: 0.1012 373/500 [=====================>........] - ETA: 42s - loss: 0.7778 - regression_loss: 0.6766 - classification_loss: 0.1012 374/500 [=====================>........] - ETA: 41s - loss: 0.7769 - regression_loss: 0.6758 - classification_loss: 0.1010 375/500 [=====================>........] - ETA: 41s - loss: 0.7776 - regression_loss: 0.6765 - classification_loss: 0.1011 376/500 [=====================>........] - ETA: 41s - loss: 0.7770 - regression_loss: 0.6761 - classification_loss: 0.1009 377/500 [=====================>........] - ETA: 40s - loss: 0.7769 - regression_loss: 0.6761 - classification_loss: 0.1009 378/500 [=====================>........] - ETA: 40s - loss: 0.7763 - regression_loss: 0.6756 - classification_loss: 0.1008 379/500 [=====================>........] - ETA: 40s - loss: 0.7753 - regression_loss: 0.6747 - classification_loss: 0.1007 380/500 [=====================>........] - ETA: 39s - loss: 0.7742 - regression_loss: 0.6737 - classification_loss: 0.1005 381/500 [=====================>........] - ETA: 39s - loss: 0.7760 - regression_loss: 0.6752 - classification_loss: 0.1008 382/500 [=====================>........] 
- ETA: 39s - loss: 0.7757 - regression_loss: 0.6750 - classification_loss: 0.1007 383/500 [=====================>........] - ETA: 38s - loss: 0.7744 - regression_loss: 0.6738 - classification_loss: 0.1006 384/500 [======================>.......] - ETA: 38s - loss: 0.7752 - regression_loss: 0.6745 - classification_loss: 0.1008 385/500 [======================>.......] - ETA: 38s - loss: 0.7758 - regression_loss: 0.6749 - classification_loss: 0.1009 386/500 [======================>.......] - ETA: 37s - loss: 0.7759 - regression_loss: 0.6750 - classification_loss: 0.1009 387/500 [======================>.......] - ETA: 37s - loss: 0.7747 - regression_loss: 0.6739 - classification_loss: 0.1007 388/500 [======================>.......] - ETA: 37s - loss: 0.7756 - regression_loss: 0.6748 - classification_loss: 0.1009 389/500 [======================>.......] - ETA: 36s - loss: 0.7766 - regression_loss: 0.6756 - classification_loss: 0.1010 390/500 [======================>.......] - ETA: 36s - loss: 0.7774 - regression_loss: 0.6763 - classification_loss: 0.1010 391/500 [======================>.......] - ETA: 36s - loss: 0.7784 - regression_loss: 0.6773 - classification_loss: 0.1012 392/500 [======================>.......] - ETA: 35s - loss: 0.7778 - regression_loss: 0.6768 - classification_loss: 0.1011 393/500 [======================>.......] - ETA: 35s - loss: 0.7765 - regression_loss: 0.6756 - classification_loss: 0.1009 394/500 [======================>.......] - ETA: 35s - loss: 0.7759 - regression_loss: 0.6751 - classification_loss: 0.1008 395/500 [======================>.......] - ETA: 34s - loss: 0.7760 - regression_loss: 0.6753 - classification_loss: 0.1007 396/500 [======================>.......] - ETA: 34s - loss: 0.7759 - regression_loss: 0.6752 - classification_loss: 0.1007 397/500 [======================>.......] - ETA: 34s - loss: 0.7755 - regression_loss: 0.6749 - classification_loss: 0.1006 398/500 [======================>.......] 
- ETA: 33s - loss: 0.7743 - regression_loss: 0.6738 - classification_loss: 0.1005 399/500 [======================>.......] - ETA: 33s - loss: 0.7746 - regression_loss: 0.6740 - classification_loss: 0.1005 400/500 [=======================>......] - ETA: 33s - loss: 0.7736 - regression_loss: 0.6732 - classification_loss: 0.1004 401/500 [=======================>......] - ETA: 32s - loss: 0.7726 - regression_loss: 0.6723 - classification_loss: 0.1002 402/500 [=======================>......] - ETA: 32s - loss: 0.7722 - regression_loss: 0.6720 - classification_loss: 0.1002 403/500 [=======================>......] - ETA: 32s - loss: 0.7727 - regression_loss: 0.6724 - classification_loss: 0.1003 404/500 [=======================>......] - ETA: 31s - loss: 0.7730 - regression_loss: 0.6727 - classification_loss: 0.1003 405/500 [=======================>......] - ETA: 31s - loss: 0.7742 - regression_loss: 0.6737 - classification_loss: 0.1005 406/500 [=======================>......] - ETA: 31s - loss: 0.7748 - regression_loss: 0.6742 - classification_loss: 0.1005 407/500 [=======================>......] - ETA: 30s - loss: 0.7762 - regression_loss: 0.6755 - classification_loss: 0.1008 408/500 [=======================>......] - ETA: 30s - loss: 0.7763 - regression_loss: 0.6755 - classification_loss: 0.1008 409/500 [=======================>......] - ETA: 30s - loss: 0.7774 - regression_loss: 0.6765 - classification_loss: 0.1010 410/500 [=======================>......] - ETA: 29s - loss: 0.7773 - regression_loss: 0.6763 - classification_loss: 0.1010 411/500 [=======================>......] - ETA: 29s - loss: 0.7774 - regression_loss: 0.6764 - classification_loss: 0.1010 412/500 [=======================>......] - ETA: 29s - loss: 0.7787 - regression_loss: 0.6775 - classification_loss: 0.1011 413/500 [=======================>......] - ETA: 28s - loss: 0.7793 - regression_loss: 0.6780 - classification_loss: 0.1013 414/500 [=======================>......] 
- ETA: 28s - loss: 0.7782 - regression_loss: 0.6771 - classification_loss: 0.1011 415/500 [=======================>......] - ETA: 28s - loss: 0.7781 - regression_loss: 0.6769 - classification_loss: 0.1012 416/500 [=======================>......] - ETA: 27s - loss: 0.7785 - regression_loss: 0.6772 - classification_loss: 0.1013 417/500 [========================>.....] - ETA: 27s - loss: 0.7790 - regression_loss: 0.6776 - classification_loss: 0.1014 418/500 [========================>.....] - ETA: 27s - loss: 0.7801 - regression_loss: 0.6787 - classification_loss: 0.1013 419/500 [========================>.....] - ETA: 26s - loss: 0.7796 - regression_loss: 0.6784 - classification_loss: 0.1012 420/500 [========================>.....] - ETA: 26s - loss: 0.7793 - regression_loss: 0.6782 - classification_loss: 0.1011 421/500 [========================>.....] - ETA: 26s - loss: 0.7796 - regression_loss: 0.6784 - classification_loss: 0.1011 422/500 [========================>.....] - ETA: 25s - loss: 0.7792 - regression_loss: 0.6782 - classification_loss: 0.1010 423/500 [========================>.....] - ETA: 25s - loss: 0.7793 - regression_loss: 0.6783 - classification_loss: 0.1010 424/500 [========================>.....] - ETA: 25s - loss: 0.7786 - regression_loss: 0.6778 - classification_loss: 0.1009 425/500 [========================>.....] - ETA: 24s - loss: 0.7797 - regression_loss: 0.6786 - classification_loss: 0.1011 426/500 [========================>.....] - ETA: 24s - loss: 0.7786 - regression_loss: 0.6777 - classification_loss: 0.1009 427/500 [========================>.....] - ETA: 24s - loss: 0.7796 - regression_loss: 0.6783 - classification_loss: 0.1012 428/500 [========================>.....] - ETA: 23s - loss: 0.7788 - regression_loss: 0.6777 - classification_loss: 0.1011 429/500 [========================>.....] - ETA: 23s - loss: 0.7783 - regression_loss: 0.6774 - classification_loss: 0.1010 430/500 [========================>.....] 
- ETA: 23s - loss: 0.7794 - regression_loss: 0.6781 - classification_loss: 0.1013 431/500 [========================>.....] - ETA: 22s - loss: 0.7796 - regression_loss: 0.6784 - classification_loss: 0.1012 432/500 [========================>.....] - ETA: 22s - loss: 0.7788 - regression_loss: 0.6777 - classification_loss: 0.1011 433/500 [========================>.....] - ETA: 22s - loss: 0.7788 - regression_loss: 0.6776 - classification_loss: 0.1011 434/500 [=========================>....] - ETA: 21s - loss: 0.7781 - regression_loss: 0.6771 - classification_loss: 0.1010 435/500 [=========================>....] - ETA: 21s - loss: 0.7773 - regression_loss: 0.6764 - classification_loss: 0.1009 436/500 [=========================>....] - ETA: 21s - loss: 0.7765 - regression_loss: 0.6757 - classification_loss: 0.1008 437/500 [=========================>....] - ETA: 20s - loss: 0.7771 - regression_loss: 0.6762 - classification_loss: 0.1009 438/500 [=========================>....] - ETA: 20s - loss: 0.7778 - regression_loss: 0.6769 - classification_loss: 0.1009 439/500 [=========================>....] - ETA: 20s - loss: 0.7783 - regression_loss: 0.6774 - classification_loss: 0.1010 440/500 [=========================>....] - ETA: 19s - loss: 0.7777 - regression_loss: 0.6768 - classification_loss: 0.1008 441/500 [=========================>....] - ETA: 19s - loss: 0.7778 - regression_loss: 0.6769 - classification_loss: 0.1009 442/500 [=========================>....] - ETA: 19s - loss: 0.7779 - regression_loss: 0.6769 - classification_loss: 0.1010 443/500 [=========================>....] - ETA: 18s - loss: 0.7787 - regression_loss: 0.6775 - classification_loss: 0.1012 444/500 [=========================>....] - ETA: 18s - loss: 0.7785 - regression_loss: 0.6773 - classification_loss: 0.1012 445/500 [=========================>....] - ETA: 18s - loss: 0.7790 - regression_loss: 0.6777 - classification_loss: 0.1013 446/500 [=========================>....] 
- ETA: 17s - loss: 0.7797 - regression_loss: 0.6783 - classification_loss: 0.1014 447/500 [=========================>....] - ETA: 17s - loss: 0.7799 - regression_loss: 0.6784 - classification_loss: 0.1014 448/500 [=========================>....] - ETA: 17s - loss: 0.7797 - regression_loss: 0.6784 - classification_loss: 0.1014 449/500 [=========================>....] - ETA: 16s - loss: 0.7794 - regression_loss: 0.6782 - classification_loss: 0.1013 450/500 [==========================>...] - ETA: 16s - loss: 0.7788 - regression_loss: 0.6776 - classification_loss: 0.1012 451/500 [==========================>...] - ETA: 16s - loss: 0.7794 - regression_loss: 0.6782 - classification_loss: 0.1012 452/500 [==========================>...] - ETA: 15s - loss: 0.7789 - regression_loss: 0.6778 - classification_loss: 0.1011 453/500 [==========================>...] - ETA: 15s - loss: 0.7781 - regression_loss: 0.6771 - classification_loss: 0.1010 454/500 [==========================>...] - ETA: 15s - loss: 0.7782 - regression_loss: 0.6773 - classification_loss: 0.1010 455/500 [==========================>...] - ETA: 14s - loss: 0.7781 - regression_loss: 0.6770 - classification_loss: 0.1011 456/500 [==========================>...] - ETA: 14s - loss: 0.7793 - regression_loss: 0.6780 - classification_loss: 0.1013 457/500 [==========================>...] - ETA: 14s - loss: 0.7795 - regression_loss: 0.6782 - classification_loss: 0.1013 458/500 [==========================>...] - ETA: 13s - loss: 0.7802 - regression_loss: 0.6787 - classification_loss: 0.1015 459/500 [==========================>...] - ETA: 13s - loss: 0.7800 - regression_loss: 0.6786 - classification_loss: 0.1015 460/500 [==========================>...] - ETA: 13s - loss: 0.7804 - regression_loss: 0.6788 - classification_loss: 0.1016 461/500 [==========================>...] - ETA: 12s - loss: 0.7818 - regression_loss: 0.6800 - classification_loss: 0.1018 462/500 [==========================>...] 
- ETA: 12s - loss: 0.7828 - regression_loss: 0.6809 - classification_loss: 0.1019 463/500 [==========================>...] - ETA: 12s - loss: 0.7829 - regression_loss: 0.6810 - classification_loss: 0.1019 464/500 [==========================>...] - ETA: 11s - loss: 0.7836 - regression_loss: 0.6815 - classification_loss: 0.1021 465/500 [==========================>...] - ETA: 11s - loss: 0.7839 - regression_loss: 0.6818 - classification_loss: 0.1020 466/500 [==========================>...] - ETA: 11s - loss: 0.7840 - regression_loss: 0.6821 - classification_loss: 0.1019 467/500 [===========================>..] - ETA: 10s - loss: 0.7840 - regression_loss: 0.6822 - classification_loss: 0.1019 468/500 [===========================>..] - ETA: 10s - loss: 0.7830 - regression_loss: 0.6812 - classification_loss: 0.1018 469/500 [===========================>..] - ETA: 10s - loss: 0.7826 - regression_loss: 0.6808 - classification_loss: 0.1017 470/500 [===========================>..] - ETA: 9s - loss: 0.7829 - regression_loss: 0.6811 - classification_loss: 0.1018  471/500 [===========================>..] - ETA: 9s - loss: 0.7824 - regression_loss: 0.6806 - classification_loss: 0.1018 472/500 [===========================>..] - ETA: 9s - loss: 0.7828 - regression_loss: 0.6810 - classification_loss: 0.1019 473/500 [===========================>..] - ETA: 8s - loss: 0.7822 - regression_loss: 0.6804 - classification_loss: 0.1017 474/500 [===========================>..] - ETA: 8s - loss: 0.7822 - regression_loss: 0.6805 - classification_loss: 0.1018 475/500 [===========================>..] - ETA: 8s - loss: 0.7821 - regression_loss: 0.6803 - classification_loss: 0.1017 476/500 [===========================>..] - ETA: 7s - loss: 0.7818 - regression_loss: 0.6801 - classification_loss: 0.1017 477/500 [===========================>..] - ETA: 7s - loss: 0.7819 - regression_loss: 0.6802 - classification_loss: 0.1016 478/500 [===========================>..] 
- ETA: 7s - loss: 0.7812 - regression_loss: 0.6797 - classification_loss: 0.1015 479/500 [===========================>..] - ETA: 6s - loss: 0.7812 - regression_loss: 0.6798 - classification_loss: 0.1014 480/500 [===========================>..] - ETA: 6s - loss: 0.7811 - regression_loss: 0.6797 - classification_loss: 0.1014 481/500 [===========================>..] - ETA: 6s - loss: 0.7819 - regression_loss: 0.6803 - classification_loss: 0.1016 482/500 [===========================>..] - ETA: 5s - loss: 0.7827 - regression_loss: 0.6810 - classification_loss: 0.1017 483/500 [===========================>..] - ETA: 5s - loss: 0.7823 - regression_loss: 0.6807 - classification_loss: 0.1016 484/500 [============================>.] - ETA: 5s - loss: 0.7819 - regression_loss: 0.6803 - classification_loss: 0.1016 485/500 [============================>.] - ETA: 4s - loss: 0.7817 - regression_loss: 0.6801 - classification_loss: 0.1016 486/500 [============================>.] - ETA: 4s - loss: 0.7817 - regression_loss: 0.6801 - classification_loss: 0.1016 487/500 [============================>.] - ETA: 4s - loss: 0.7810 - regression_loss: 0.6795 - classification_loss: 0.1015 488/500 [============================>.] - ETA: 3s - loss: 0.7799 - regression_loss: 0.6786 - classification_loss: 0.1014 489/500 [============================>.] - ETA: 3s - loss: 0.7804 - regression_loss: 0.6789 - classification_loss: 0.1015 490/500 [============================>.] - ETA: 3s - loss: 0.7815 - regression_loss: 0.6799 - classification_loss: 0.1015 491/500 [============================>.] - ETA: 2s - loss: 0.7822 - regression_loss: 0.6806 - classification_loss: 0.1016 492/500 [============================>.] - ETA: 2s - loss: 0.7822 - regression_loss: 0.6806 - classification_loss: 0.1016 493/500 [============================>.] - ETA: 2s - loss: 0.7822 - regression_loss: 0.6806 - classification_loss: 0.1015 494/500 [============================>.] 
- ETA: 1s - loss: 0.7810 - regression_loss: 0.6796 - classification_loss: 0.1014 495/500 [============================>.] - ETA: 1s - loss: 0.7814 - regression_loss: 0.6800 - classification_loss: 0.1014 496/500 [============================>.] - ETA: 1s - loss: 0.7817 - regression_loss: 0.6802 - classification_loss: 0.1015 497/500 [============================>.] - ETA: 0s - loss: 0.7824 - regression_loss: 0.6808 - classification_loss: 0.1016 498/500 [============================>.] - ETA: 0s - loss: 0.7828 - regression_loss: 0.6811 - classification_loss: 0.1017 499/500 [============================>.] - ETA: 0s - loss: 0.7829 - regression_loss: 0.6812 - classification_loss: 0.1017 500/500 [==============================] - 165s 331ms/step - loss: 0.7837 - regression_loss: 0.6818 - classification_loss: 0.1018 1172 instances of class plum with average precision: 0.6263 mAP: 0.6263 Epoch 00040: saving model to ./training/snapshots/resnet101_pascal_40.h5 Epoch 41/150 1/500 [..............................] - ETA: 2:35 - loss: 0.5529 - regression_loss: 0.4909 - classification_loss: 0.0621 2/500 [..............................] - ETA: 2:42 - loss: 0.7898 - regression_loss: 0.6822 - classification_loss: 0.1076 3/500 [..............................] - ETA: 2:41 - loss: 1.0166 - regression_loss: 0.8626 - classification_loss: 0.1540 4/500 [..............................] - ETA: 2:43 - loss: 1.0561 - regression_loss: 0.9054 - classification_loss: 0.1508 5/500 [..............................] - ETA: 2:43 - loss: 0.9942 - regression_loss: 0.8546 - classification_loss: 0.1396 6/500 [..............................] - ETA: 2:42 - loss: 0.9650 - regression_loss: 0.8248 - classification_loss: 0.1402 7/500 [..............................] - ETA: 2:43 - loss: 0.9273 - regression_loss: 0.7923 - classification_loss: 0.1350 8/500 [..............................] - ETA: 2:43 - loss: 0.9207 - regression_loss: 0.7880 - classification_loss: 0.1327 9/500 [..............................] 
- ETA: 2:41 - loss: 0.8722 - regression_loss: 0.7496 - classification_loss: 0.1226 10/500 [..............................] - ETA: 2:40 - loss: 0.8575 - regression_loss: 0.7399 - classification_loss: 0.1176 11/500 [..............................] 
[... per-batch Keras progress updates for batches 11-344 of 500 elided; over this span the ETA fell from 2:40 to 51s while loss drifted from ~0.86 down to ~0.77 (regression_loss ~0.75 -> ~0.66, classification_loss ~0.12 -> ~0.10) ...]
- ETA: 51s - loss: 0.7669 - regression_loss: 0.6647 - classification_loss: 0.1021 345/500 [===================>..........] 
- ETA: 51s - loss: 0.7680 - regression_loss: 0.6660 - classification_loss: 0.1021 346/500 [===================>..........] - ETA: 50s - loss: 0.7683 - regression_loss: 0.6662 - classification_loss: 0.1021 347/500 [===================>..........] - ETA: 50s - loss: 0.7677 - regression_loss: 0.6657 - classification_loss: 0.1020 348/500 [===================>..........] - ETA: 50s - loss: 0.7684 - regression_loss: 0.6664 - classification_loss: 0.1020 349/500 [===================>..........] - ETA: 49s - loss: 0.7697 - regression_loss: 0.6673 - classification_loss: 0.1023 350/500 [====================>.........] - ETA: 49s - loss: 0.7698 - regression_loss: 0.6674 - classification_loss: 0.1024 351/500 [====================>.........] - ETA: 49s - loss: 0.7697 - regression_loss: 0.6674 - classification_loss: 0.1022 352/500 [====================>.........] - ETA: 48s - loss: 0.7691 - regression_loss: 0.6669 - classification_loss: 0.1021 353/500 [====================>.........] - ETA: 48s - loss: 0.7681 - regression_loss: 0.6661 - classification_loss: 0.1020 354/500 [====================>.........] - ETA: 48s - loss: 0.7676 - regression_loss: 0.6657 - classification_loss: 0.1019 355/500 [====================>.........] - ETA: 47s - loss: 0.7682 - regression_loss: 0.6663 - classification_loss: 0.1019 356/500 [====================>.........] - ETA: 47s - loss: 0.7676 - regression_loss: 0.6659 - classification_loss: 0.1018 357/500 [====================>.........] - ETA: 47s - loss: 0.7678 - regression_loss: 0.6661 - classification_loss: 0.1017 358/500 [====================>.........] - ETA: 46s - loss: 0.7683 - regression_loss: 0.6665 - classification_loss: 0.1018 359/500 [====================>.........] - ETA: 46s - loss: 0.7679 - regression_loss: 0.6661 - classification_loss: 0.1018 360/500 [====================>.........] - ETA: 46s - loss: 0.7678 - regression_loss: 0.6660 - classification_loss: 0.1018 361/500 [====================>.........] 
- ETA: 45s - loss: 0.7691 - regression_loss: 0.6670 - classification_loss: 0.1020 362/500 [====================>.........] - ETA: 45s - loss: 0.7683 - regression_loss: 0.6664 - classification_loss: 0.1019 363/500 [====================>.........] - ETA: 45s - loss: 0.7692 - regression_loss: 0.6671 - classification_loss: 0.1021 364/500 [====================>.........] - ETA: 44s - loss: 0.7683 - regression_loss: 0.6663 - classification_loss: 0.1020 365/500 [====================>.........] - ETA: 44s - loss: 0.7672 - regression_loss: 0.6654 - classification_loss: 0.1018 366/500 [====================>.........] - ETA: 44s - loss: 0.7679 - regression_loss: 0.6661 - classification_loss: 0.1019 367/500 [=====================>........] - ETA: 43s - loss: 0.7681 - regression_loss: 0.6662 - classification_loss: 0.1019 368/500 [=====================>........] - ETA: 43s - loss: 0.7680 - regression_loss: 0.6661 - classification_loss: 0.1019 369/500 [=====================>........] - ETA: 43s - loss: 0.7667 - regression_loss: 0.6650 - classification_loss: 0.1017 370/500 [=====================>........] - ETA: 42s - loss: 0.7664 - regression_loss: 0.6647 - classification_loss: 0.1016 371/500 [=====================>........] - ETA: 42s - loss: 0.7677 - regression_loss: 0.6660 - classification_loss: 0.1017 372/500 [=====================>........] - ETA: 42s - loss: 0.7673 - regression_loss: 0.6657 - classification_loss: 0.1015 373/500 [=====================>........] - ETA: 41s - loss: 0.7665 - regression_loss: 0.6651 - classification_loss: 0.1014 374/500 [=====================>........] - ETA: 41s - loss: 0.7672 - regression_loss: 0.6656 - classification_loss: 0.1016 375/500 [=====================>........] - ETA: 41s - loss: 0.7671 - regression_loss: 0.6656 - classification_loss: 0.1016 376/500 [=====================>........] - ETA: 40s - loss: 0.7674 - regression_loss: 0.6658 - classification_loss: 0.1016 377/500 [=====================>........] 
- ETA: 40s - loss: 0.7683 - regression_loss: 0.6666 - classification_loss: 0.1017 378/500 [=====================>........] - ETA: 40s - loss: 0.7676 - regression_loss: 0.6659 - classification_loss: 0.1016 379/500 [=====================>........] - ETA: 39s - loss: 0.7664 - regression_loss: 0.6649 - classification_loss: 0.1015 380/500 [=====================>........] - ETA: 39s - loss: 0.7666 - regression_loss: 0.6651 - classification_loss: 0.1015 381/500 [=====================>........] - ETA: 39s - loss: 0.7655 - regression_loss: 0.6641 - classification_loss: 0.1014 382/500 [=====================>........] - ETA: 38s - loss: 0.7644 - regression_loss: 0.6632 - classification_loss: 0.1012 383/500 [=====================>........] - ETA: 38s - loss: 0.7652 - regression_loss: 0.6638 - classification_loss: 0.1013 384/500 [======================>.......] - ETA: 38s - loss: 0.7659 - regression_loss: 0.6645 - classification_loss: 0.1014 385/500 [======================>.......] - ETA: 37s - loss: 0.7666 - regression_loss: 0.6651 - classification_loss: 0.1015 386/500 [======================>.......] - ETA: 37s - loss: 0.7652 - regression_loss: 0.6638 - classification_loss: 0.1013 387/500 [======================>.......] - ETA: 37s - loss: 0.7644 - regression_loss: 0.6632 - classification_loss: 0.1012 388/500 [======================>.......] - ETA: 36s - loss: 0.7653 - regression_loss: 0.6638 - classification_loss: 0.1014 389/500 [======================>.......] - ETA: 36s - loss: 0.7660 - regression_loss: 0.6644 - classification_loss: 0.1016 390/500 [======================>.......] - ETA: 36s - loss: 0.7663 - regression_loss: 0.6648 - classification_loss: 0.1015 391/500 [======================>.......] - ETA: 35s - loss: 0.7664 - regression_loss: 0.6649 - classification_loss: 0.1015 392/500 [======================>.......] - ETA: 35s - loss: 0.7656 - regression_loss: 0.6642 - classification_loss: 0.1014 393/500 [======================>.......] 
- ETA: 35s - loss: 0.7655 - regression_loss: 0.6642 - classification_loss: 0.1013 394/500 [======================>.......] - ETA: 34s - loss: 0.7656 - regression_loss: 0.6643 - classification_loss: 0.1013 395/500 [======================>.......] - ETA: 34s - loss: 0.7655 - regression_loss: 0.6641 - classification_loss: 0.1014 396/500 [======================>.......] - ETA: 34s - loss: 0.7651 - regression_loss: 0.6638 - classification_loss: 0.1013 397/500 [======================>.......] - ETA: 33s - loss: 0.7651 - regression_loss: 0.6638 - classification_loss: 0.1013 398/500 [======================>.......] - ETA: 33s - loss: 0.7642 - regression_loss: 0.6631 - classification_loss: 0.1011 399/500 [======================>.......] - ETA: 33s - loss: 0.7645 - regression_loss: 0.6633 - classification_loss: 0.1012 400/500 [=======================>......] - ETA: 33s - loss: 0.7643 - regression_loss: 0.6630 - classification_loss: 0.1012 401/500 [=======================>......] - ETA: 32s - loss: 0.7638 - regression_loss: 0.6626 - classification_loss: 0.1012 402/500 [=======================>......] - ETA: 32s - loss: 0.7644 - regression_loss: 0.6631 - classification_loss: 0.1013 403/500 [=======================>......] - ETA: 32s - loss: 0.7648 - regression_loss: 0.6635 - classification_loss: 0.1013 404/500 [=======================>......] - ETA: 31s - loss: 0.7644 - regression_loss: 0.6632 - classification_loss: 0.1012 405/500 [=======================>......] - ETA: 31s - loss: 0.7652 - regression_loss: 0.6640 - classification_loss: 0.1012 406/500 [=======================>......] - ETA: 31s - loss: 0.7640 - regression_loss: 0.6630 - classification_loss: 0.1010 407/500 [=======================>......] - ETA: 30s - loss: 0.7643 - regression_loss: 0.6633 - classification_loss: 0.1010 408/500 [=======================>......] - ETA: 30s - loss: 0.7636 - regression_loss: 0.6627 - classification_loss: 0.1009 409/500 [=======================>......] 
- ETA: 30s - loss: 0.7640 - regression_loss: 0.6631 - classification_loss: 0.1009 410/500 [=======================>......] - ETA: 29s - loss: 0.7637 - regression_loss: 0.6628 - classification_loss: 0.1009 411/500 [=======================>......] - ETA: 29s - loss: 0.7641 - regression_loss: 0.6632 - classification_loss: 0.1009 412/500 [=======================>......] - ETA: 29s - loss: 0.7650 - regression_loss: 0.6639 - classification_loss: 0.1010 413/500 [=======================>......] - ETA: 28s - loss: 0.7653 - regression_loss: 0.6641 - classification_loss: 0.1012 414/500 [=======================>......] - ETA: 28s - loss: 0.7657 - regression_loss: 0.6645 - classification_loss: 0.1013 415/500 [=======================>......] - ETA: 28s - loss: 0.7656 - regression_loss: 0.6642 - classification_loss: 0.1014 416/500 [=======================>......] - ETA: 27s - loss: 0.7656 - regression_loss: 0.6642 - classification_loss: 0.1014 417/500 [========================>.....] - ETA: 27s - loss: 0.7659 - regression_loss: 0.6645 - classification_loss: 0.1014 418/500 [========================>.....] - ETA: 27s - loss: 0.7668 - regression_loss: 0.6652 - classification_loss: 0.1016 419/500 [========================>.....] - ETA: 26s - loss: 0.7665 - regression_loss: 0.6649 - classification_loss: 0.1016 420/500 [========================>.....] - ETA: 26s - loss: 0.7667 - regression_loss: 0.6652 - classification_loss: 0.1015 421/500 [========================>.....] - ETA: 26s - loss: 0.7659 - regression_loss: 0.6645 - classification_loss: 0.1014 422/500 [========================>.....] - ETA: 25s - loss: 0.7670 - regression_loss: 0.6654 - classification_loss: 0.1015 423/500 [========================>.....] - ETA: 25s - loss: 0.7664 - regression_loss: 0.6650 - classification_loss: 0.1014 424/500 [========================>.....] - ETA: 25s - loss: 0.7654 - regression_loss: 0.6641 - classification_loss: 0.1012 425/500 [========================>.....] 
- ETA: 24s - loss: 0.7661 - regression_loss: 0.6648 - classification_loss: 0.1013 426/500 [========================>.....] - ETA: 24s - loss: 0.7669 - regression_loss: 0.6655 - classification_loss: 0.1014 427/500 [========================>.....] - ETA: 24s - loss: 0.7661 - regression_loss: 0.6648 - classification_loss: 0.1013 428/500 [========================>.....] - ETA: 23s - loss: 0.7653 - regression_loss: 0.6641 - classification_loss: 0.1012 429/500 [========================>.....] - ETA: 23s - loss: 0.7654 - regression_loss: 0.6642 - classification_loss: 0.1012 430/500 [========================>.....] - ETA: 23s - loss: 0.7641 - regression_loss: 0.6631 - classification_loss: 0.1010 431/500 [========================>.....] - ETA: 22s - loss: 0.7641 - regression_loss: 0.6631 - classification_loss: 0.1010 432/500 [========================>.....] - ETA: 22s - loss: 0.7633 - regression_loss: 0.6624 - classification_loss: 0.1009 433/500 [========================>.....] - ETA: 22s - loss: 0.7642 - regression_loss: 0.6632 - classification_loss: 0.1010 434/500 [=========================>....] - ETA: 21s - loss: 0.7640 - regression_loss: 0.6630 - classification_loss: 0.1010 435/500 [=========================>....] - ETA: 21s - loss: 0.7643 - regression_loss: 0.6633 - classification_loss: 0.1010 436/500 [=========================>....] - ETA: 21s - loss: 0.7656 - regression_loss: 0.6644 - classification_loss: 0.1012 437/500 [=========================>....] - ETA: 20s - loss: 0.7647 - regression_loss: 0.6636 - classification_loss: 0.1010 438/500 [=========================>....] - ETA: 20s - loss: 0.7645 - regression_loss: 0.6635 - classification_loss: 0.1010 439/500 [=========================>....] - ETA: 20s - loss: 0.7659 - regression_loss: 0.6646 - classification_loss: 0.1013 440/500 [=========================>....] - ETA: 19s - loss: 0.7671 - regression_loss: 0.6656 - classification_loss: 0.1015 441/500 [=========================>....] 
- ETA: 19s - loss: 0.7665 - regression_loss: 0.6651 - classification_loss: 0.1014 442/500 [=========================>....] - ETA: 19s - loss: 0.7667 - regression_loss: 0.6653 - classification_loss: 0.1014 443/500 [=========================>....] - ETA: 18s - loss: 0.7673 - regression_loss: 0.6657 - classification_loss: 0.1016 444/500 [=========================>....] - ETA: 18s - loss: 0.7671 - regression_loss: 0.6656 - classification_loss: 0.1016 445/500 [=========================>....] - ETA: 18s - loss: 0.7667 - regression_loss: 0.6653 - classification_loss: 0.1014 446/500 [=========================>....] - ETA: 17s - loss: 0.7668 - regression_loss: 0.6655 - classification_loss: 0.1013 447/500 [=========================>....] - ETA: 17s - loss: 0.7665 - regression_loss: 0.6653 - classification_loss: 0.1012 448/500 [=========================>....] - ETA: 17s - loss: 0.7666 - regression_loss: 0.6654 - classification_loss: 0.1012 449/500 [=========================>....] - ETA: 16s - loss: 0.7668 - regression_loss: 0.6656 - classification_loss: 0.1012 450/500 [==========================>...] - ETA: 16s - loss: 0.7660 - regression_loss: 0.6650 - classification_loss: 0.1010 451/500 [==========================>...] - ETA: 16s - loss: 0.7664 - regression_loss: 0.6652 - classification_loss: 0.1011 452/500 [==========================>...] - ETA: 15s - loss: 0.7659 - regression_loss: 0.6648 - classification_loss: 0.1011 453/500 [==========================>...] - ETA: 15s - loss: 0.7665 - regression_loss: 0.6654 - classification_loss: 0.1011 454/500 [==========================>...] - ETA: 15s - loss: 0.7666 - regression_loss: 0.6655 - classification_loss: 0.1011 455/500 [==========================>...] - ETA: 14s - loss: 0.7667 - regression_loss: 0.6655 - classification_loss: 0.1012 456/500 [==========================>...] - ETA: 14s - loss: 0.7667 - regression_loss: 0.6655 - classification_loss: 0.1012 457/500 [==========================>...] 
- ETA: 14s - loss: 0.7666 - regression_loss: 0.6653 - classification_loss: 0.1012 458/500 [==========================>...] - ETA: 13s - loss: 0.7659 - regression_loss: 0.6647 - classification_loss: 0.1012 459/500 [==========================>...] - ETA: 13s - loss: 0.7658 - regression_loss: 0.6647 - classification_loss: 0.1011 460/500 [==========================>...] - ETA: 13s - loss: 0.7654 - regression_loss: 0.6643 - classification_loss: 0.1011 461/500 [==========================>...] - ETA: 12s - loss: 0.7657 - regression_loss: 0.6646 - classification_loss: 0.1011 462/500 [==========================>...] - ETA: 12s - loss: 0.7651 - regression_loss: 0.6641 - classification_loss: 0.1010 463/500 [==========================>...] - ETA: 12s - loss: 0.7646 - regression_loss: 0.6638 - classification_loss: 0.1009 464/500 [==========================>...] - ETA: 11s - loss: 0.7643 - regression_loss: 0.6635 - classification_loss: 0.1008 465/500 [==========================>...] - ETA: 11s - loss: 0.7634 - regression_loss: 0.6626 - classification_loss: 0.1007 466/500 [==========================>...] - ETA: 11s - loss: 0.7634 - regression_loss: 0.6626 - classification_loss: 0.1008 467/500 [===========================>..] - ETA: 10s - loss: 0.7624 - regression_loss: 0.6618 - classification_loss: 0.1007 468/500 [===========================>..] - ETA: 10s - loss: 0.7624 - regression_loss: 0.6617 - classification_loss: 0.1007 469/500 [===========================>..] - ETA: 10s - loss: 0.7622 - regression_loss: 0.6616 - classification_loss: 0.1006 470/500 [===========================>..] - ETA: 9s - loss: 0.7617 - regression_loss: 0.6612 - classification_loss: 0.1006  471/500 [===========================>..] - ETA: 9s - loss: 0.7609 - regression_loss: 0.6605 - classification_loss: 0.1004 472/500 [===========================>..] - ETA: 9s - loss: 0.7608 - regression_loss: 0.6604 - classification_loss: 0.1004 473/500 [===========================>..] 
- ETA: 8s - loss: 0.7607 - regression_loss: 0.6604 - classification_loss: 0.1003 474/500 [===========================>..] - ETA: 8s - loss: 0.7602 - regression_loss: 0.6600 - classification_loss: 0.1003 475/500 [===========================>..] - ETA: 8s - loss: 0.7605 - regression_loss: 0.6602 - classification_loss: 0.1003 476/500 [===========================>..] - ETA: 7s - loss: 0.7599 - regression_loss: 0.6597 - classification_loss: 0.1002 477/500 [===========================>..] - ETA: 7s - loss: 0.7590 - regression_loss: 0.6590 - classification_loss: 0.1001 478/500 [===========================>..] - ETA: 7s - loss: 0.7592 - regression_loss: 0.6591 - classification_loss: 0.1001 479/500 [===========================>..] - ETA: 6s - loss: 0.7595 - regression_loss: 0.6593 - classification_loss: 0.1002 480/500 [===========================>..] - ETA: 6s - loss: 0.7599 - regression_loss: 0.6596 - classification_loss: 0.1002 481/500 [===========================>..] - ETA: 6s - loss: 0.7591 - regression_loss: 0.6590 - classification_loss: 0.1001 482/500 [===========================>..] - ETA: 5s - loss: 0.7592 - regression_loss: 0.6590 - classification_loss: 0.1001 483/500 [===========================>..] - ETA: 5s - loss: 0.7600 - regression_loss: 0.6597 - classification_loss: 0.1003 484/500 [============================>.] - ETA: 5s - loss: 0.7600 - regression_loss: 0.6597 - classification_loss: 0.1002 485/500 [============================>.] - ETA: 4s - loss: 0.7602 - regression_loss: 0.6600 - classification_loss: 0.1002 486/500 [============================>.] - ETA: 4s - loss: 0.7596 - regression_loss: 0.6595 - classification_loss: 0.1001 487/500 [============================>.] - ETA: 4s - loss: 0.7594 - regression_loss: 0.6593 - classification_loss: 0.1001 488/500 [============================>.] - ETA: 3s - loss: 0.7586 - regression_loss: 0.6586 - classification_loss: 0.1000 489/500 [============================>.] 
- ETA: 3s - loss: 0.7584 - regression_loss: 0.6584 - classification_loss: 0.1000 490/500 [============================>.] - ETA: 3s - loss: 0.7588 - regression_loss: 0.6589 - classification_loss: 0.0999 491/500 [============================>.] - ETA: 2s - loss: 0.7582 - regression_loss: 0.6583 - classification_loss: 0.0998 492/500 [============================>.] - ETA: 2s - loss: 0.7577 - regression_loss: 0.6579 - classification_loss: 0.0998 493/500 [============================>.] - ETA: 2s - loss: 0.7578 - regression_loss: 0.6580 - classification_loss: 0.0998 494/500 [============================>.] - ETA: 1s - loss: 0.7579 - regression_loss: 0.6581 - classification_loss: 0.0998 495/500 [============================>.] - ETA: 1s - loss: 0.7571 - regression_loss: 0.6574 - classification_loss: 0.0997 496/500 [============================>.] - ETA: 1s - loss: 0.7574 - regression_loss: 0.6577 - classification_loss: 0.0997 497/500 [============================>.] - ETA: 0s - loss: 0.7570 - regression_loss: 0.6574 - classification_loss: 0.0996 498/500 [============================>.] - ETA: 0s - loss: 0.7570 - regression_loss: 0.6574 - classification_loss: 0.0996 499/500 [============================>.] - ETA: 0s - loss: 0.7561 - regression_loss: 0.6566 - classification_loss: 0.0995 500/500 [==============================] - 165s 331ms/step - loss: 0.7565 - regression_loss: 0.6568 - classification_loss: 0.0996 1172 instances of class plum with average precision: 0.6310 mAP: 0.6310 Epoch 00041: saving model to ./training/snapshots/resnet101_pascal_41.h5 Epoch 42/150 1/500 [..............................] - ETA: 2:42 - loss: 1.1675 - regression_loss: 1.0053 - classification_loss: 0.1623 2/500 [..............................] - ETA: 2:44 - loss: 1.0954 - regression_loss: 0.9572 - classification_loss: 0.1382 3/500 [..............................] - ETA: 2:43 - loss: 1.0232 - regression_loss: 0.8956 - classification_loss: 0.1276 4/500 [..............................] 
- ETA: 2:42 - loss: 0.9531 - regression_loss: 0.8423 - classification_loss: 0.1108 5/500 [..............................] - ETA: 2:43 - loss: 0.9292 - regression_loss: 0.8181 - classification_loss: 0.1111 6/500 [..............................] - ETA: 2:43 - loss: 0.8354 - regression_loss: 0.7344 - classification_loss: 0.1010 7/500 [..............................] - ETA: 2:42 - loss: 0.8844 - regression_loss: 0.7762 - classification_loss: 0.1081 8/500 [..............................] - ETA: 2:44 - loss: 0.8037 - regression_loss: 0.7053 - classification_loss: 0.0984 9/500 [..............................] - ETA: 2:43 - loss: 0.8261 - regression_loss: 0.7215 - classification_loss: 0.1046 10/500 [..............................] - ETA: 2:44 - loss: 0.8220 - regression_loss: 0.7152 - classification_loss: 0.1068 11/500 [..............................] - ETA: 2:44 - loss: 0.7850 - regression_loss: 0.6833 - classification_loss: 0.1017 12/500 [..............................] - ETA: 2:44 - loss: 0.7415 - regression_loss: 0.6452 - classification_loss: 0.0962 13/500 [..............................] - ETA: 2:44 - loss: 0.7633 - regression_loss: 0.6644 - classification_loss: 0.0989 14/500 [..............................] - ETA: 2:42 - loss: 0.7934 - regression_loss: 0.6910 - classification_loss: 0.1024 15/500 [..............................] - ETA: 2:41 - loss: 0.7828 - regression_loss: 0.6821 - classification_loss: 0.1007 16/500 [..............................] - ETA: 2:41 - loss: 0.7893 - regression_loss: 0.6863 - classification_loss: 0.1030 17/500 [>.............................] - ETA: 2:40 - loss: 0.7634 - regression_loss: 0.6640 - classification_loss: 0.0994 18/500 [>.............................] - ETA: 2:40 - loss: 0.7721 - regression_loss: 0.6688 - classification_loss: 0.1033 19/500 [>.............................] - ETA: 2:39 - loss: 0.7754 - regression_loss: 0.6732 - classification_loss: 0.1021 20/500 [>.............................] 
- ETA: 2:39 - loss: 0.7744 - regression_loss: 0.6714 - classification_loss: 0.1030 21/500 [>.............................] - ETA: 2:39 - loss: 0.7817 - regression_loss: 0.6800 - classification_loss: 0.1017 22/500 [>.............................] - ETA: 2:38 - loss: 0.8131 - regression_loss: 0.7061 - classification_loss: 0.1070 23/500 [>.............................] - ETA: 2:38 - loss: 0.8085 - regression_loss: 0.7012 - classification_loss: 0.1073 24/500 [>.............................] - ETA: 2:38 - loss: 0.7954 - regression_loss: 0.6901 - classification_loss: 0.1053 25/500 [>.............................] - ETA: 2:37 - loss: 0.8023 - regression_loss: 0.6964 - classification_loss: 0.1059 26/500 [>.............................] - ETA: 2:36 - loss: 0.8148 - regression_loss: 0.7060 - classification_loss: 0.1088 27/500 [>.............................] - ETA: 2:36 - loss: 0.8207 - regression_loss: 0.7106 - classification_loss: 0.1100 28/500 [>.............................] - ETA: 2:36 - loss: 0.8265 - regression_loss: 0.7193 - classification_loss: 0.1072 29/500 [>.............................] - ETA: 2:36 - loss: 0.8180 - regression_loss: 0.7117 - classification_loss: 0.1064 30/500 [>.............................] - ETA: 2:36 - loss: 0.7957 - regression_loss: 0.6921 - classification_loss: 0.1036 31/500 [>.............................] - ETA: 2:35 - loss: 0.7805 - regression_loss: 0.6789 - classification_loss: 0.1016 32/500 [>.............................] - ETA: 2:35 - loss: 0.7734 - regression_loss: 0.6727 - classification_loss: 0.1008 33/500 [>.............................] - ETA: 2:35 - loss: 0.7910 - regression_loss: 0.6879 - classification_loss: 0.1031 34/500 [=>............................] - ETA: 2:34 - loss: 0.7969 - regression_loss: 0.6939 - classification_loss: 0.1030 35/500 [=>............................] - ETA: 2:34 - loss: 0.8032 - regression_loss: 0.6997 - classification_loss: 0.1035 36/500 [=>............................] 
- ETA: 2:33 - loss: 0.7943 - regression_loss: 0.6922 - classification_loss: 0.1021 37/500 [=>............................] - ETA: 2:33 - loss: 0.7816 - regression_loss: 0.6812 - classification_loss: 0.1004 38/500 [=>............................] - ETA: 2:33 - loss: 0.7769 - regression_loss: 0.6782 - classification_loss: 0.0988 39/500 [=>............................] - ETA: 2:33 - loss: 0.7823 - regression_loss: 0.6826 - classification_loss: 0.0998 40/500 [=>............................] - ETA: 2:32 - loss: 0.7761 - regression_loss: 0.6775 - classification_loss: 0.0986 41/500 [=>............................] - ETA: 2:32 - loss: 0.7657 - regression_loss: 0.6687 - classification_loss: 0.0970 42/500 [=>............................] - ETA: 2:32 - loss: 0.7738 - regression_loss: 0.6741 - classification_loss: 0.0998 43/500 [=>............................] - ETA: 2:31 - loss: 0.7693 - regression_loss: 0.6707 - classification_loss: 0.0987 44/500 [=>............................] - ETA: 2:31 - loss: 0.7746 - regression_loss: 0.6749 - classification_loss: 0.0997 45/500 [=>............................] - ETA: 2:30 - loss: 0.7752 - regression_loss: 0.6760 - classification_loss: 0.0992 46/500 [=>............................] - ETA: 2:30 - loss: 0.7879 - regression_loss: 0.6868 - classification_loss: 0.1011 47/500 [=>............................] - ETA: 2:30 - loss: 0.7908 - regression_loss: 0.6895 - classification_loss: 0.1014 48/500 [=>............................] - ETA: 2:30 - loss: 0.7979 - regression_loss: 0.6953 - classification_loss: 0.1027 49/500 [=>............................] - ETA: 2:29 - loss: 0.7989 - regression_loss: 0.6967 - classification_loss: 0.1021 50/500 [==>...........................] - ETA: 2:29 - loss: 0.7980 - regression_loss: 0.6960 - classification_loss: 0.1020 51/500 [==>...........................] - ETA: 2:29 - loss: 0.8032 - regression_loss: 0.7000 - classification_loss: 0.1031 52/500 [==>...........................] 
- ETA: 2:28 - loss: 0.7990 - regression_loss: 0.6966 - classification_loss: 0.1024 53/500 [==>...........................] - ETA: 2:28 - loss: 0.7887 - regression_loss: 0.6876 - classification_loss: 0.1011 54/500 [==>...........................] - ETA: 2:27 - loss: 0.7789 - regression_loss: 0.6792 - classification_loss: 0.0997 55/500 [==>...........................] - ETA: 2:27 - loss: 0.7683 - regression_loss: 0.6698 - classification_loss: 0.0985 56/500 [==>...........................] - ETA: 2:27 - loss: 0.7694 - regression_loss: 0.6706 - classification_loss: 0.0988 57/500 [==>...........................] - ETA: 2:26 - loss: 0.7807 - regression_loss: 0.6803 - classification_loss: 0.1004 58/500 [==>...........................] - ETA: 2:26 - loss: 0.7835 - regression_loss: 0.6829 - classification_loss: 0.1005 59/500 [==>...........................] - ETA: 2:25 - loss: 0.7782 - regression_loss: 0.6782 - classification_loss: 0.1000 60/500 [==>...........................] - ETA: 2:25 - loss: 0.7787 - regression_loss: 0.6790 - classification_loss: 0.0997 61/500 [==>...........................] - ETA: 2:25 - loss: 0.7723 - regression_loss: 0.6735 - classification_loss: 0.0988 62/500 [==>...........................] - ETA: 2:24 - loss: 0.7674 - regression_loss: 0.6695 - classification_loss: 0.0979 63/500 [==>...........................] - ETA: 2:24 - loss: 0.7722 - regression_loss: 0.6728 - classification_loss: 0.0994 64/500 [==>...........................] - ETA: 2:24 - loss: 0.7661 - regression_loss: 0.6675 - classification_loss: 0.0986 65/500 [==>...........................] - ETA: 2:23 - loss: 0.7685 - regression_loss: 0.6696 - classification_loss: 0.0989 66/500 [==>...........................] - ETA: 2:23 - loss: 0.7637 - regression_loss: 0.6657 - classification_loss: 0.0980 67/500 [===>..........................] - ETA: 2:23 - loss: 0.7610 - regression_loss: 0.6635 - classification_loss: 0.0975 68/500 [===>..........................] 
- ETA: 2:22 - loss: 0.7638 - regression_loss: 0.6657 - classification_loss: 0.0982 69/500 [===>..........................] - ETA: 2:22 - loss: 0.7623 - regression_loss: 0.6642 - classification_loss: 0.0981 70/500 [===>..........................] - ETA: 2:22 - loss: 0.7634 - regression_loss: 0.6654 - classification_loss: 0.0980 71/500 [===>..........................] - ETA: 2:21 - loss: 0.7669 - regression_loss: 0.6681 - classification_loss: 0.0987 72/500 [===>..........................] - ETA: 2:21 - loss: 0.7649 - regression_loss: 0.6665 - classification_loss: 0.0984 73/500 [===>..........................] - ETA: 2:21 - loss: 0.7600 - regression_loss: 0.6619 - classification_loss: 0.0981 74/500 [===>..........................] - ETA: 2:21 - loss: 0.7562 - regression_loss: 0.6586 - classification_loss: 0.0977 75/500 [===>..........................] - ETA: 2:20 - loss: 0.7622 - regression_loss: 0.6632 - classification_loss: 0.0989 76/500 [===>..........................] - ETA: 2:20 - loss: 0.7580 - regression_loss: 0.6595 - classification_loss: 0.0985 77/500 [===>..........................] - ETA: 2:19 - loss: 0.7546 - regression_loss: 0.6566 - classification_loss: 0.0980 78/500 [===>..........................] - ETA: 2:19 - loss: 0.7488 - regression_loss: 0.6515 - classification_loss: 0.0973 79/500 [===>..........................] - ETA: 2:19 - loss: 0.7517 - regression_loss: 0.6538 - classification_loss: 0.0979 80/500 [===>..........................] - ETA: 2:18 - loss: 0.7498 - regression_loss: 0.6522 - classification_loss: 0.0976 81/500 [===>..........................] - ETA: 2:18 - loss: 0.7482 - regression_loss: 0.6514 - classification_loss: 0.0968 82/500 [===>..........................] - ETA: 2:18 - loss: 0.7487 - regression_loss: 0.6516 - classification_loss: 0.0971 83/500 [===>..........................] - ETA: 2:17 - loss: 0.7492 - regression_loss: 0.6517 - classification_loss: 0.0974 84/500 [====>.........................] 
- ETA: 2:17 - loss: 0.7489 - regression_loss: 0.6515 - classification_loss: 0.0973 85/500 [====>.........................] - ETA: 2:16 - loss: 0.7453 - regression_loss: 0.6486 - classification_loss: 0.0967 86/500 [====>.........................] - ETA: 2:16 - loss: 0.7502 - regression_loss: 0.6526 - classification_loss: 0.0977 87/500 [====>.........................] - ETA: 2:16 - loss: 0.7452 - regression_loss: 0.6482 - classification_loss: 0.0970 88/500 [====>.........................] - ETA: 2:16 - loss: 0.7511 - regression_loss: 0.6532 - classification_loss: 0.0979 89/500 [====>.........................] - ETA: 2:15 - loss: 0.7543 - regression_loss: 0.6557 - classification_loss: 0.0985 90/500 [====>.........................] - ETA: 2:15 - loss: 0.7546 - regression_loss: 0.6560 - classification_loss: 0.0986 91/500 [====>.........................] - ETA: 2:15 - loss: 0.7534 - regression_loss: 0.6553 - classification_loss: 0.0981 92/500 [====>.........................] - ETA: 2:14 - loss: 0.7500 - regression_loss: 0.6525 - classification_loss: 0.0975 93/500 [====>.........................] - ETA: 2:14 - loss: 0.7446 - regression_loss: 0.6477 - classification_loss: 0.0968 94/500 [====>.........................] - ETA: 2:14 - loss: 0.7446 - regression_loss: 0.6475 - classification_loss: 0.0971 95/500 [====>.........................] - ETA: 2:13 - loss: 0.7409 - regression_loss: 0.6444 - classification_loss: 0.0965 96/500 [====>.........................] - ETA: 2:13 - loss: 0.7431 - regression_loss: 0.6459 - classification_loss: 0.0972 97/500 [====>.........................] - ETA: 2:12 - loss: 0.7463 - regression_loss: 0.6487 - classification_loss: 0.0975 98/500 [====>.........................] - ETA: 2:12 - loss: 0.7493 - regression_loss: 0.6518 - classification_loss: 0.0975 99/500 [====>.........................] - ETA: 2:12 - loss: 0.7477 - regression_loss: 0.6504 - classification_loss: 0.0973 100/500 [=====>........................] 
- ETA: 2:11 - loss: 0.7434 - regression_loss: 0.6467 - classification_loss: 0.0967 101/500 [=====>........................] - ETA: 2:11 - loss: 0.7437 - regression_loss: 0.6469 - classification_loss: 0.0968 102/500 [=====>........................] - ETA: 2:11 - loss: 0.7433 - regression_loss: 0.6464 - classification_loss: 0.0969 103/500 [=====>........................] - ETA: 2:10 - loss: 0.7501 - regression_loss: 0.6521 - classification_loss: 0.0979 104/500 [=====>........................] - ETA: 2:10 - loss: 0.7497 - regression_loss: 0.6519 - classification_loss: 0.0978 105/500 [=====>........................] - ETA: 2:10 - loss: 0.7466 - regression_loss: 0.6485 - classification_loss: 0.0981 106/500 [=====>........................] - ETA: 2:09 - loss: 0.7441 - regression_loss: 0.6462 - classification_loss: 0.0979 107/500 [=====>........................] - ETA: 2:09 - loss: 0.7449 - regression_loss: 0.6465 - classification_loss: 0.0984 108/500 [=====>........................] - ETA: 2:09 - loss: 0.7482 - regression_loss: 0.6494 - classification_loss: 0.0989 109/500 [=====>........................] - ETA: 2:09 - loss: 0.7505 - regression_loss: 0.6512 - classification_loss: 0.0993 110/500 [=====>........................] - ETA: 2:08 - loss: 0.7516 - regression_loss: 0.6524 - classification_loss: 0.0992 111/500 [=====>........................] - ETA: 2:08 - loss: 0.7564 - regression_loss: 0.6563 - classification_loss: 0.1001 112/500 [=====>........................] - ETA: 2:08 - loss: 0.7532 - regression_loss: 0.6537 - classification_loss: 0.0995 113/500 [=====>........................] - ETA: 2:07 - loss: 0.7527 - regression_loss: 0.6533 - classification_loss: 0.0994 114/500 [=====>........................] - ETA: 2:07 - loss: 0.7560 - regression_loss: 0.6557 - classification_loss: 0.1003 115/500 [=====>........................] - ETA: 2:07 - loss: 0.7588 - regression_loss: 0.6578 - classification_loss: 0.1010 116/500 [=====>........................] 
- ETA: 2:06 - loss: 0.7585 - regression_loss: 0.6576 - classification_loss: 0.1009 117/500 [======>.......................] - ETA: 2:06 - loss: 0.7548 - regression_loss: 0.6544 - classification_loss: 0.1003 118/500 [======>.......................] - ETA: 2:05 - loss: 0.7535 - regression_loss: 0.6533 - classification_loss: 0.1003 119/500 [======>.......................] - ETA: 2:05 - loss: 0.7564 - regression_loss: 0.6554 - classification_loss: 0.1010 120/500 [======>.......................] - ETA: 2:05 - loss: 0.7577 - regression_loss: 0.6565 - classification_loss: 0.1012 121/500 [======>.......................] - ETA: 2:04 - loss: 0.7603 - regression_loss: 0.6586 - classification_loss: 0.1018 122/500 [======>.......................] - ETA: 2:04 - loss: 0.7577 - regression_loss: 0.6563 - classification_loss: 0.1014 123/500 [======>.......................] - ETA: 2:04 - loss: 0.7573 - regression_loss: 0.6560 - classification_loss: 0.1013 124/500 [======>.......................] - ETA: 2:03 - loss: 0.7586 - regression_loss: 0.6570 - classification_loss: 0.1016 125/500 [======>.......................] - ETA: 2:03 - loss: 0.7565 - regression_loss: 0.6554 - classification_loss: 0.1011 126/500 [======>.......................] - ETA: 2:03 - loss: 0.7549 - regression_loss: 0.6540 - classification_loss: 0.1009 127/500 [======>.......................] - ETA: 2:02 - loss: 0.7556 - regression_loss: 0.6548 - classification_loss: 0.1009 128/500 [======>.......................] - ETA: 2:02 - loss: 0.7528 - regression_loss: 0.6524 - classification_loss: 0.1004 129/500 [======>.......................] - ETA: 2:02 - loss: 0.7486 - regression_loss: 0.6486 - classification_loss: 0.1000 130/500 [======>.......................] - ETA: 2:01 - loss: 0.7464 - regression_loss: 0.6466 - classification_loss: 0.0998 131/500 [======>.......................] - ETA: 2:01 - loss: 0.7481 - regression_loss: 0.6482 - classification_loss: 0.0999 132/500 [======>.......................] 
- ETA: 2:01 - loss: 0.7485 - regression_loss: 0.6486 - classification_loss: 0.0999 133/500 [======>.......................] - ETA: 2:00 - loss: 0.7498 - regression_loss: 0.6497 - classification_loss: 0.1001 134/500 [=======>......................] - ETA: 2:00 - loss: 0.7458 - regression_loss: 0.6462 - classification_loss: 0.0996 135/500 [=======>......................] - ETA: 2:00 - loss: 0.7434 - regression_loss: 0.6443 - classification_loss: 0.0991 136/500 [=======>......................] - ETA: 1:59 - loss: 0.7422 - regression_loss: 0.6434 - classification_loss: 0.0989 137/500 [=======>......................] - ETA: 1:59 - loss: 0.7428 - regression_loss: 0.6435 - classification_loss: 0.0993 138/500 [=======>......................] - ETA: 1:59 - loss: 0.7403 - regression_loss: 0.6414 - classification_loss: 0.0989 139/500 [=======>......................] - ETA: 1:58 - loss: 0.7434 - regression_loss: 0.6442 - classification_loss: 0.0992 140/500 [=======>......................] - ETA: 1:58 - loss: 0.7477 - regression_loss: 0.6480 - classification_loss: 0.0997 141/500 [=======>......................] - ETA: 1:58 - loss: 0.7492 - regression_loss: 0.6491 - classification_loss: 0.1002 142/500 [=======>......................] - ETA: 1:58 - loss: 0.7496 - regression_loss: 0.6493 - classification_loss: 0.1003 143/500 [=======>......................] - ETA: 1:57 - loss: 0.7498 - regression_loss: 0.6495 - classification_loss: 0.1003 144/500 [=======>......................] - ETA: 1:57 - loss: 0.7507 - regression_loss: 0.6505 - classification_loss: 0.1003 145/500 [=======>......................] - ETA: 1:57 - loss: 0.7489 - regression_loss: 0.6490 - classification_loss: 0.0999 146/500 [=======>......................] - ETA: 1:56 - loss: 0.7476 - regression_loss: 0.6480 - classification_loss: 0.0996 147/500 [=======>......................] - ETA: 1:56 - loss: 0.7470 - regression_loss: 0.6475 - classification_loss: 0.0995 148/500 [=======>......................] 
- ETA: 1:56 - loss: 0.7467 - regression_loss: 0.6472 - classification_loss: 0.0996 149/500 [=======>......................] - ETA: 1:55 - loss: 0.7457 - regression_loss: 0.6466 - classification_loss: 0.0992 150/500 [========>.....................] - ETA: 1:55 - loss: 0.7439 - regression_loss: 0.6449 - classification_loss: 0.0990 151/500 [========>.....................] - ETA: 1:55 - loss: 0.7457 - regression_loss: 0.6465 - classification_loss: 0.0993 152/500 [========>.....................] - ETA: 1:54 - loss: 0.7470 - regression_loss: 0.6477 - classification_loss: 0.0993 153/500 [========>.....................] - ETA: 1:54 - loss: 0.7468 - regression_loss: 0.6475 - classification_loss: 0.0992 154/500 [========>.....................] - ETA: 1:54 - loss: 0.7442 - regression_loss: 0.6453 - classification_loss: 0.0988 155/500 [========>.....................] - ETA: 1:53 - loss: 0.7455 - regression_loss: 0.6466 - classification_loss: 0.0990 156/500 [========>.....................] - ETA: 1:53 - loss: 0.7462 - regression_loss: 0.6472 - classification_loss: 0.0990 157/500 [========>.....................] - ETA: 1:53 - loss: 0.7459 - regression_loss: 0.6466 - classification_loss: 0.0993 158/500 [========>.....................] - ETA: 1:52 - loss: 0.7439 - regression_loss: 0.6447 - classification_loss: 0.0992 159/500 [========>.....................] - ETA: 1:52 - loss: 0.7424 - regression_loss: 0.6434 - classification_loss: 0.0990 160/500 [========>.....................] - ETA: 1:52 - loss: 0.7427 - regression_loss: 0.6439 - classification_loss: 0.0989 161/500 [========>.....................] - ETA: 1:51 - loss: 0.7432 - regression_loss: 0.6441 - classification_loss: 0.0992 162/500 [========>.....................] - ETA: 1:51 - loss: 0.7404 - regression_loss: 0.6417 - classification_loss: 0.0987 163/500 [========>.....................] - ETA: 1:51 - loss: 0.7369 - regression_loss: 0.6386 - classification_loss: 0.0983 164/500 [========>.....................] 
- ETA: 1:50 - loss: 0.7363 - regression_loss: 0.6380 - classification_loss: 0.0983 165/500 [========>.....................] - ETA: 1:50 - loss: 0.7342 - regression_loss: 0.6362 - classification_loss: 0.0980 166/500 [========>.....................] - ETA: 1:50 - loss: 0.7367 - regression_loss: 0.6382 - classification_loss: 0.0985 167/500 [=========>....................] - ETA: 1:49 - loss: 0.7352 - regression_loss: 0.6370 - classification_loss: 0.0982 168/500 [=========>....................] - ETA: 1:49 - loss: 0.7372 - regression_loss: 0.6388 - classification_loss: 0.0984 169/500 [=========>....................] - ETA: 1:49 - loss: 0.7339 - regression_loss: 0.6358 - classification_loss: 0.0981 170/500 [=========>....................] - ETA: 1:48 - loss: 0.7330 - regression_loss: 0.6352 - classification_loss: 0.0979 171/500 [=========>....................] - ETA: 1:48 - loss: 0.7319 - regression_loss: 0.6342 - classification_loss: 0.0976 172/500 [=========>....................] - ETA: 1:48 - loss: 0.7301 - regression_loss: 0.6328 - classification_loss: 0.0974 173/500 [=========>....................] - ETA: 1:47 - loss: 0.7283 - regression_loss: 0.6312 - classification_loss: 0.0971 174/500 [=========>....................] - ETA: 1:47 - loss: 0.7266 - regression_loss: 0.6297 - classification_loss: 0.0969 175/500 [=========>....................] - ETA: 1:47 - loss: 0.7289 - regression_loss: 0.6313 - classification_loss: 0.0976 176/500 [=========>....................] - ETA: 1:46 - loss: 0.7320 - regression_loss: 0.6339 - classification_loss: 0.0980 177/500 [=========>....................] - ETA: 1:46 - loss: 0.7325 - regression_loss: 0.6344 - classification_loss: 0.0982 178/500 [=========>....................] - ETA: 1:46 - loss: 0.7308 - regression_loss: 0.6331 - classification_loss: 0.0978 179/500 [=========>....................] - ETA: 1:45 - loss: 0.7314 - regression_loss: 0.6336 - classification_loss: 0.0977 180/500 [=========>....................] 
- ETA: 1:45 - loss: 0.7306 - regression_loss: 0.6329 - classification_loss: 0.0976 181/500 [=========>....................] - ETA: 1:45 - loss: 0.7343 - regression_loss: 0.6358 - classification_loss: 0.0985 182/500 [=========>....................] - ETA: 1:44 - loss: 0.7339 - regression_loss: 0.6356 - classification_loss: 0.0983 183/500 [=========>....................] - ETA: 1:44 - loss: 0.7359 - regression_loss: 0.6372 - classification_loss: 0.0987 184/500 [==========>...................] - ETA: 1:44 - loss: 0.7377 - regression_loss: 0.6386 - classification_loss: 0.0991 185/500 [==========>...................] - ETA: 1:43 - loss: 0.7384 - regression_loss: 0.6391 - classification_loss: 0.0993 186/500 [==========>...................] - ETA: 1:43 - loss: 0.7406 - regression_loss: 0.6410 - classification_loss: 0.0996 187/500 [==========>...................] - ETA: 1:43 - loss: 0.7373 - regression_loss: 0.6381 - classification_loss: 0.0992 188/500 [==========>...................] - ETA: 1:42 - loss: 0.7377 - regression_loss: 0.6386 - classification_loss: 0.0991 189/500 [==========>...................] - ETA: 1:42 - loss: 0.7393 - regression_loss: 0.6401 - classification_loss: 0.0992 190/500 [==========>...................] - ETA: 1:42 - loss: 0.7375 - regression_loss: 0.6387 - classification_loss: 0.0988 191/500 [==========>...................] - ETA: 1:41 - loss: 0.7387 - regression_loss: 0.6396 - classification_loss: 0.0991 192/500 [==========>...................] - ETA: 1:41 - loss: 0.7378 - regression_loss: 0.6389 - classification_loss: 0.0989 193/500 [==========>...................] - ETA: 1:41 - loss: 0.7391 - regression_loss: 0.6400 - classification_loss: 0.0991 194/500 [==========>...................] - ETA: 1:40 - loss: 0.7395 - regression_loss: 0.6404 - classification_loss: 0.0991 195/500 [==========>...................] - ETA: 1:40 - loss: 0.7386 - regression_loss: 0.6396 - classification_loss: 0.0989 196/500 [==========>...................] 
- ETA: 1:40 - loss: 0.7392 - regression_loss: 0.6404 - classification_loss: 0.0988 197/500 [==========>...................] - ETA: 1:39 - loss: 0.7404 - regression_loss: 0.6415 - classification_loss: 0.0988 198/500 [==========>...................] - ETA: 1:39 - loss: 0.7407 - regression_loss: 0.6418 - classification_loss: 0.0988 199/500 [==========>...................] - ETA: 1:39 - loss: 0.7403 - regression_loss: 0.6416 - classification_loss: 0.0987 200/500 [===========>..................] - ETA: 1:38 - loss: 0.7403 - regression_loss: 0.6415 - classification_loss: 0.0988 201/500 [===========>..................] - ETA: 1:38 - loss: 0.7429 - regression_loss: 0.6435 - classification_loss: 0.0994 202/500 [===========>..................] - ETA: 1:38 - loss: 0.7452 - regression_loss: 0.6451 - classification_loss: 0.1001 203/500 [===========>..................] - ETA: 1:37 - loss: 0.7438 - regression_loss: 0.6441 - classification_loss: 0.0997 204/500 [===========>..................] - ETA: 1:37 - loss: 0.7429 - regression_loss: 0.6433 - classification_loss: 0.0995 205/500 [===========>..................] - ETA: 1:37 - loss: 0.7414 - regression_loss: 0.6421 - classification_loss: 0.0993 206/500 [===========>..................] - ETA: 1:36 - loss: 0.7399 - regression_loss: 0.6409 - classification_loss: 0.0990 207/500 [===========>..................] - ETA: 1:36 - loss: 0.7407 - regression_loss: 0.6418 - classification_loss: 0.0989 208/500 [===========>..................] - ETA: 1:36 - loss: 0.7412 - regression_loss: 0.6421 - classification_loss: 0.0991 209/500 [===========>..................] - ETA: 1:35 - loss: 0.7402 - regression_loss: 0.6413 - classification_loss: 0.0989 210/500 [===========>..................] - ETA: 1:35 - loss: 0.7378 - regression_loss: 0.6392 - classification_loss: 0.0986 211/500 [===========>..................] - ETA: 1:35 - loss: 0.7377 - regression_loss: 0.6391 - classification_loss: 0.0986 212/500 [===========>..................] 
- ETA: 1:34 - loss: 0.7385 - regression_loss: 0.6400 - classification_loss: 0.0985 213/500 [===========>..................] - ETA: 1:34 - loss: 0.7386 - regression_loss: 0.6400 - classification_loss: 0.0985 214/500 [===========>..................] - ETA: 1:34 - loss: 0.7371 - regression_loss: 0.6389 - classification_loss: 0.0982 215/500 [===========>..................] - ETA: 1:33 - loss: 0.7375 - regression_loss: 0.6393 - classification_loss: 0.0982 216/500 [===========>..................] - ETA: 1:33 - loss: 0.7360 - regression_loss: 0.6380 - classification_loss: 0.0979 217/500 [============>.................] - ETA: 1:33 - loss: 0.7349 - regression_loss: 0.6372 - classification_loss: 0.0977 218/500 [============>.................] - ETA: 1:32 - loss: 0.7389 - regression_loss: 0.6405 - classification_loss: 0.0984 219/500 [============>.................] - ETA: 1:32 - loss: 0.7392 - regression_loss: 0.6408 - classification_loss: 0.0984 220/500 [============>.................] - ETA: 1:32 - loss: 0.7368 - regression_loss: 0.6388 - classification_loss: 0.0981 221/500 [============>.................] - ETA: 1:31 - loss: 0.7379 - regression_loss: 0.6398 - classification_loss: 0.0982 222/500 [============>.................] - ETA: 1:31 - loss: 0.7386 - regression_loss: 0.6403 - classification_loss: 0.0983 223/500 [============>.................] - ETA: 1:31 - loss: 0.7389 - regression_loss: 0.6405 - classification_loss: 0.0983 224/500 [============>.................] - ETA: 1:30 - loss: 0.7387 - regression_loss: 0.6404 - classification_loss: 0.0982 225/500 [============>.................] - ETA: 1:30 - loss: 0.7381 - regression_loss: 0.6399 - classification_loss: 0.0983 226/500 [============>.................] - ETA: 1:30 - loss: 0.7376 - regression_loss: 0.6395 - classification_loss: 0.0982 227/500 [============>.................] - ETA: 1:29 - loss: 0.7380 - regression_loss: 0.6398 - classification_loss: 0.0982 228/500 [============>.................] 
- ETA: 1:29 - loss: 0.7363 - regression_loss: 0.6384 - classification_loss: 0.0979 229/500 [============>.................] - ETA: 1:29 - loss: 0.7359 - regression_loss: 0.6380 - classification_loss: 0.0978 230/500 [============>.................] - ETA: 1:28 - loss: 0.7373 - regression_loss: 0.6393 - classification_loss: 0.0980 231/500 [============>.................] - ETA: 1:28 - loss: 0.7357 - regression_loss: 0.6380 - classification_loss: 0.0977 232/500 [============>.................] - ETA: 1:28 - loss: 0.7356 - regression_loss: 0.6380 - classification_loss: 0.0976 233/500 [============>.................] - ETA: 1:27 - loss: 0.7366 - regression_loss: 0.6390 - classification_loss: 0.0975 234/500 [=============>................] - ETA: 1:27 - loss: 0.7357 - regression_loss: 0.6383 - classification_loss: 0.0974 235/500 [=============>................] - ETA: 1:27 - loss: 0.7393 - regression_loss: 0.6412 - classification_loss: 0.0980 236/500 [=============>................] - ETA: 1:27 - loss: 0.7390 - regression_loss: 0.6410 - classification_loss: 0.0980 237/500 [=============>................] - ETA: 1:26 - loss: 0.7383 - regression_loss: 0.6405 - classification_loss: 0.0979 238/500 [=============>................] - ETA: 1:26 - loss: 0.7403 - regression_loss: 0.6421 - classification_loss: 0.0982 239/500 [=============>................] - ETA: 1:26 - loss: 0.7413 - regression_loss: 0.6433 - classification_loss: 0.0980 240/500 [=============>................] - ETA: 1:25 - loss: 0.7396 - regression_loss: 0.6418 - classification_loss: 0.0978 241/500 [=============>................] - ETA: 1:25 - loss: 0.7401 - regression_loss: 0.6424 - classification_loss: 0.0977 242/500 [=============>................] - ETA: 1:25 - loss: 0.7399 - regression_loss: 0.6423 - classification_loss: 0.0976 243/500 [=============>................] - ETA: 1:24 - loss: 0.7379 - regression_loss: 0.6406 - classification_loss: 0.0973 244/500 [=============>................] 
- ETA: 1:24 - loss: 0.7370 - regression_loss: 0.6399 - classification_loss: 0.0971 245/500 [=============>................] - ETA: 1:23 - loss: 0.7365 - regression_loss: 0.6396 - classification_loss: 0.0969 246/500 [=============>................] - ETA: 1:23 - loss: 0.7347 - regression_loss: 0.6380 - classification_loss: 0.0967 247/500 [=============>................] - ETA: 1:23 - loss: 0.7333 - regression_loss: 0.6368 - classification_loss: 0.0965 248/500 [=============>................] - ETA: 1:22 - loss: 0.7315 - regression_loss: 0.6353 - classification_loss: 0.0962 249/500 [=============>................] - ETA: 1:22 - loss: 0.7338 - regression_loss: 0.6376 - classification_loss: 0.0963 250/500 [==============>...............] - ETA: 1:22 - loss: 0.7326 - regression_loss: 0.6365 - classification_loss: 0.0961 251/500 [==============>...............] - ETA: 1:21 - loss: 0.7330 - regression_loss: 0.6367 - classification_loss: 0.0963 252/500 [==============>...............] - ETA: 1:21 - loss: 0.7336 - regression_loss: 0.6371 - classification_loss: 0.0965 253/500 [==============>...............] - ETA: 1:21 - loss: 0.7349 - regression_loss: 0.6380 - classification_loss: 0.0969 254/500 [==============>...............] - ETA: 1:21 - loss: 0.7358 - regression_loss: 0.6388 - classification_loss: 0.0970 255/500 [==============>...............] - ETA: 1:20 - loss: 0.7357 - regression_loss: 0.6387 - classification_loss: 0.0970 256/500 [==============>...............] - ETA: 1:20 - loss: 0.7379 - regression_loss: 0.6405 - classification_loss: 0.0974 257/500 [==============>...............] - ETA: 1:20 - loss: 0.7404 - regression_loss: 0.6425 - classification_loss: 0.0979 258/500 [==============>...............] - ETA: 1:19 - loss: 0.7400 - regression_loss: 0.6423 - classification_loss: 0.0977 259/500 [==============>...............] - ETA: 1:19 - loss: 0.7404 - regression_loss: 0.6425 - classification_loss: 0.0979 260/500 [==============>...............] 
- ETA: 1:19 - loss: 0.7408 - regression_loss: 0.6427 - classification_loss: 0.0981 261/500 [==============>...............] - ETA: 1:18 - loss: 0.7397 - regression_loss: 0.6416 - classification_loss: 0.0981 262/500 [==============>...............] - ETA: 1:18 - loss: 0.7393 - regression_loss: 0.6413 - classification_loss: 0.0980 263/500 [==============>...............] - ETA: 1:18 - loss: 0.7382 - regression_loss: 0.6402 - classification_loss: 0.0980 264/500 [==============>...............] - ETA: 1:17 - loss: 0.7378 - regression_loss: 0.6398 - classification_loss: 0.0980 265/500 [==============>...............] - ETA: 1:17 - loss: 0.7371 - regression_loss: 0.6392 - classification_loss: 0.0978 266/500 [==============>...............] - ETA: 1:17 - loss: 0.7352 - regression_loss: 0.6374 - classification_loss: 0.0978 267/500 [===============>..............] - ETA: 1:16 - loss: 0.7365 - regression_loss: 0.6387 - classification_loss: 0.0979 268/500 [===============>..............] - ETA: 1:16 - loss: 0.7357 - regression_loss: 0.6379 - classification_loss: 0.0978 269/500 [===============>..............] - ETA: 1:16 - loss: 0.7364 - regression_loss: 0.6385 - classification_loss: 0.0979 270/500 [===============>..............] - ETA: 1:15 - loss: 0.7366 - regression_loss: 0.6387 - classification_loss: 0.0979 271/500 [===============>..............] - ETA: 1:15 - loss: 0.7366 - regression_loss: 0.6385 - classification_loss: 0.0981 272/500 [===============>..............] - ETA: 1:15 - loss: 0.7352 - regression_loss: 0.6373 - classification_loss: 0.0979 273/500 [===============>..............] - ETA: 1:14 - loss: 0.7342 - regression_loss: 0.6365 - classification_loss: 0.0977 274/500 [===============>..............] - ETA: 1:14 - loss: 0.7354 - regression_loss: 0.6375 - classification_loss: 0.0979 275/500 [===============>..............] - ETA: 1:14 - loss: 0.7346 - regression_loss: 0.6368 - classification_loss: 0.0978 276/500 [===============>..............] 
- ETA: 1:13 - loss: 0.7375 - regression_loss: 0.6393 - classification_loss: 0.0982 277/500 [===============>..............] - ETA: 1:13 - loss: 0.7365 - regression_loss: 0.6384 - classification_loss: 0.0981 278/500 [===============>..............] - ETA: 1:13 - loss: 0.7352 - regression_loss: 0.6374 - classification_loss: 0.0978 279/500 [===============>..............] - ETA: 1:12 - loss: 0.7340 - regression_loss: 0.6363 - classification_loss: 0.0978 280/500 [===============>..............] - ETA: 1:12 - loss: 0.7340 - regression_loss: 0.6363 - classification_loss: 0.0977 281/500 [===============>..............] - ETA: 1:12 - loss: 0.7327 - regression_loss: 0.6352 - classification_loss: 0.0975 282/500 [===============>..............] - ETA: 1:11 - loss: 0.7330 - regression_loss: 0.6354 - classification_loss: 0.0976 283/500 [===============>..............] - ETA: 1:11 - loss: 0.7314 - regression_loss: 0.6341 - classification_loss: 0.0973 284/500 [================>.............] - ETA: 1:11 - loss: 0.7306 - regression_loss: 0.6334 - classification_loss: 0.0972 285/500 [================>.............] - ETA: 1:10 - loss: 0.7315 - regression_loss: 0.6343 - classification_loss: 0.0972 286/500 [================>.............] - ETA: 1:10 - loss: 0.7298 - regression_loss: 0.6328 - classification_loss: 0.0970 287/500 [================>.............] - ETA: 1:10 - loss: 0.7303 - regression_loss: 0.6332 - classification_loss: 0.0971 288/500 [================>.............] - ETA: 1:09 - loss: 0.7309 - regression_loss: 0.6336 - classification_loss: 0.0973 289/500 [================>.............] - ETA: 1:09 - loss: 0.7299 - regression_loss: 0.6327 - classification_loss: 0.0972 290/500 [================>.............] - ETA: 1:09 - loss: 0.7304 - regression_loss: 0.6332 - classification_loss: 0.0972 291/500 [================>.............] - ETA: 1:08 - loss: 0.7303 - regression_loss: 0.6333 - classification_loss: 0.0971 292/500 [================>.............] 
- ETA: 1:08 - loss: 0.7302 - regression_loss: 0.6331 - classification_loss: 0.0971 293/500 [================>.............] - ETA: 1:08 - loss: 0.7295 - regression_loss: 0.6326 - classification_loss: 0.0969 294/500 [================>.............] - ETA: 1:07 - loss: 0.7285 - regression_loss: 0.6317 - classification_loss: 0.0967 295/500 [================>.............] - ETA: 1:07 - loss: 0.7286 - regression_loss: 0.6319 - classification_loss: 0.0967 296/500 [================>.............] - ETA: 1:07 - loss: 0.7271 - regression_loss: 0.6306 - classification_loss: 0.0965 297/500 [================>.............] - ETA: 1:06 - loss: 0.7268 - regression_loss: 0.6305 - classification_loss: 0.0963 298/500 [================>.............] - ETA: 1:06 - loss: 0.7272 - regression_loss: 0.6308 - classification_loss: 0.0964 299/500 [================>.............] - ETA: 1:06 - loss: 0.7268 - regression_loss: 0.6305 - classification_loss: 0.0963 300/500 [=================>............] - ETA: 1:05 - loss: 0.7265 - regression_loss: 0.6303 - classification_loss: 0.0962 301/500 [=================>............] - ETA: 1:05 - loss: 0.7274 - regression_loss: 0.6311 - classification_loss: 0.0962 302/500 [=================>............] - ETA: 1:05 - loss: 0.7275 - regression_loss: 0.6312 - classification_loss: 0.0963 303/500 [=================>............] - ETA: 1:04 - loss: 0.7268 - regression_loss: 0.6307 - classification_loss: 0.0961 304/500 [=================>............] - ETA: 1:04 - loss: 0.7276 - regression_loss: 0.6313 - classification_loss: 0.0963 305/500 [=================>............] - ETA: 1:04 - loss: 0.7290 - regression_loss: 0.6324 - classification_loss: 0.0966 306/500 [=================>............] - ETA: 1:03 - loss: 0.7296 - regression_loss: 0.6328 - classification_loss: 0.0968 307/500 [=================>............] - ETA: 1:03 - loss: 0.7293 - regression_loss: 0.6326 - classification_loss: 0.0967 308/500 [=================>............] 
- ETA: 1:03 - loss: 0.7298 - regression_loss: 0.6331 - classification_loss: 0.0967 309/500 [=================>............] - ETA: 1:02 - loss: 0.7303 - regression_loss: 0.6335 - classification_loss: 0.0967 310/500 [=================>............] - ETA: 1:02 - loss: 0.7306 - regression_loss: 0.6339 - classification_loss: 0.0968 311/500 [=================>............] - ETA: 1:02 - loss: 0.7320 - regression_loss: 0.6350 - classification_loss: 0.0971 312/500 [=================>............] - ETA: 1:01 - loss: 0.7331 - regression_loss: 0.6362 - classification_loss: 0.0970 313/500 [=================>............] - ETA: 1:01 - loss: 0.7354 - regression_loss: 0.6381 - classification_loss: 0.0973 314/500 [=================>............] - ETA: 1:01 - loss: 0.7355 - regression_loss: 0.6382 - classification_loss: 0.0973 315/500 [=================>............] - ETA: 1:00 - loss: 0.7351 - regression_loss: 0.6379 - classification_loss: 0.0972 316/500 [=================>............] - ETA: 1:00 - loss: 0.7347 - regression_loss: 0.6375 - classification_loss: 0.0972 317/500 [==================>...........] - ETA: 1:00 - loss: 0.7346 - regression_loss: 0.6375 - classification_loss: 0.0971 318/500 [==================>...........] - ETA: 59s - loss: 0.7335 - regression_loss: 0.6365 - classification_loss: 0.0969  319/500 [==================>...........] - ETA: 59s - loss: 0.7339 - regression_loss: 0.6370 - classification_loss: 0.0969 320/500 [==================>...........] - ETA: 59s - loss: 0.7342 - regression_loss: 0.6373 - classification_loss: 0.0968 321/500 [==================>...........] - ETA: 58s - loss: 0.7343 - regression_loss: 0.6375 - classification_loss: 0.0968 322/500 [==================>...........] - ETA: 58s - loss: 0.7350 - regression_loss: 0.6383 - classification_loss: 0.0967 323/500 [==================>...........] - ETA: 58s - loss: 0.7339 - regression_loss: 0.6372 - classification_loss: 0.0967 324/500 [==================>...........] 
- ETA: 58s - loss: 0.7328 - regression_loss: 0.6362 - classification_loss: 0.0965 325/500 [==================>...........] - ETA: 57s - loss: 0.7344 - regression_loss: 0.6376 - classification_loss: 0.0968 326/500 [==================>...........] - ETA: 57s - loss: 0.7334 - regression_loss: 0.6367 - classification_loss: 0.0967 327/500 [==================>...........] - ETA: 57s - loss: 0.7323 - regression_loss: 0.6358 - classification_loss: 0.0965 328/500 [==================>...........] - ETA: 56s - loss: 0.7336 - regression_loss: 0.6369 - classification_loss: 0.0966 329/500 [==================>...........] - ETA: 56s - loss: 0.7342 - regression_loss: 0.6375 - classification_loss: 0.0967 330/500 [==================>...........] - ETA: 56s - loss: 0.7347 - regression_loss: 0.6380 - classification_loss: 0.0966 331/500 [==================>...........] - ETA: 55s - loss: 0.7362 - regression_loss: 0.6394 - classification_loss: 0.0969 332/500 [==================>...........] - ETA: 55s - loss: 0.7376 - regression_loss: 0.6404 - classification_loss: 0.0972 333/500 [==================>...........] - ETA: 55s - loss: 0.7364 - regression_loss: 0.6393 - classification_loss: 0.0970 334/500 [===================>..........] - ETA: 54s - loss: 0.7358 - regression_loss: 0.6389 - classification_loss: 0.0969 335/500 [===================>..........] - ETA: 54s - loss: 0.7349 - regression_loss: 0.6381 - classification_loss: 0.0968 336/500 [===================>..........] - ETA: 54s - loss: 0.7352 - regression_loss: 0.6383 - classification_loss: 0.0968 337/500 [===================>..........] - ETA: 53s - loss: 0.7353 - regression_loss: 0.6384 - classification_loss: 0.0969 338/500 [===================>..........] - ETA: 53s - loss: 0.7362 - regression_loss: 0.6391 - classification_loss: 0.0971 339/500 [===================>..........] - ETA: 53s - loss: 0.7356 - regression_loss: 0.6387 - classification_loss: 0.0969 340/500 [===================>..........] 
- ETA: 52s - loss: 0.7345 - regression_loss: 0.6377 - classification_loss: 0.0968 341/500 [===================>..........] - ETA: 52s - loss: 0.7331 - regression_loss: 0.6365 - classification_loss: 0.0965 342/500 [===================>..........] - ETA: 52s - loss: 0.7317 - regression_loss: 0.6354 - classification_loss: 0.0964 343/500 [===================>..........] - ETA: 51s - loss: 0.7319 - regression_loss: 0.6355 - classification_loss: 0.0964 344/500 [===================>..........] - ETA: 51s - loss: 0.7328 - regression_loss: 0.6363 - classification_loss: 0.0965 345/500 [===================>..........] - ETA: 51s - loss: 0.7337 - regression_loss: 0.6370 - classification_loss: 0.0967 346/500 [===================>..........] - ETA: 50s - loss: 0.7329 - regression_loss: 0.6363 - classification_loss: 0.0966 347/500 [===================>..........] - ETA: 50s - loss: 0.7332 - regression_loss: 0.6366 - classification_loss: 0.0966 348/500 [===================>..........] - ETA: 50s - loss: 0.7321 - regression_loss: 0.6358 - classification_loss: 0.0964 349/500 [===================>..........] - ETA: 49s - loss: 0.7331 - regression_loss: 0.6366 - classification_loss: 0.0965 350/500 [====================>.........] - ETA: 49s - loss: 0.7331 - regression_loss: 0.6366 - classification_loss: 0.0964 351/500 [====================>.........] - ETA: 49s - loss: 0.7327 - regression_loss: 0.6363 - classification_loss: 0.0964 352/500 [====================>.........] - ETA: 48s - loss: 0.7321 - regression_loss: 0.6358 - classification_loss: 0.0963 353/500 [====================>.........] - ETA: 48s - loss: 0.7318 - regression_loss: 0.6356 - classification_loss: 0.0963 354/500 [====================>.........] - ETA: 48s - loss: 0.7309 - regression_loss: 0.6348 - classification_loss: 0.0961 355/500 [====================>.........] - ETA: 47s - loss: 0.7304 - regression_loss: 0.6344 - classification_loss: 0.0960 356/500 [====================>.........] 
- ETA: 47s - loss: 0.7294 - regression_loss: 0.6335 - classification_loss: 0.0959 357/500 [====================>.........] - ETA: 47s - loss: 0.7284 - regression_loss: 0.6327 - classification_loss: 0.0957 358/500 [====================>.........] - ETA: 46s - loss: 0.7291 - regression_loss: 0.6334 - classification_loss: 0.0958 359/500 [====================>.........] - ETA: 46s - loss: 0.7281 - regression_loss: 0.6324 - classification_loss: 0.0957 360/500 [====================>.........] - ETA: 46s - loss: 0.7291 - regression_loss: 0.6333 - classification_loss: 0.0958 361/500 [====================>.........] - ETA: 45s - loss: 0.7296 - regression_loss: 0.6337 - classification_loss: 0.0959 362/500 [====================>.........] - ETA: 45s - loss: 0.7286 - regression_loss: 0.6329 - classification_loss: 0.0958 363/500 [====================>.........] - ETA: 45s - loss: 0.7281 - regression_loss: 0.6324 - classification_loss: 0.0957 364/500 [====================>.........] - ETA: 44s - loss: 0.7297 - regression_loss: 0.6338 - classification_loss: 0.0959 365/500 [====================>.........] - ETA: 44s - loss: 0.7299 - regression_loss: 0.6340 - classification_loss: 0.0959 366/500 [====================>.........] - ETA: 44s - loss: 0.7291 - regression_loss: 0.6333 - classification_loss: 0.0958 367/500 [=====================>........] - ETA: 43s - loss: 0.7281 - regression_loss: 0.6325 - classification_loss: 0.0956 368/500 [=====================>........] - ETA: 43s - loss: 0.7292 - regression_loss: 0.6334 - classification_loss: 0.0958 369/500 [=====================>........] - ETA: 43s - loss: 0.7293 - regression_loss: 0.6335 - classification_loss: 0.0958 370/500 [=====================>........] - ETA: 42s - loss: 0.7303 - regression_loss: 0.6343 - classification_loss: 0.0960 371/500 [=====================>........] - ETA: 42s - loss: 0.7303 - regression_loss: 0.6342 - classification_loss: 0.0960 372/500 [=====================>........] 
- ETA: 42s - loss: 0.7298 - regression_loss: 0.6339 - classification_loss: 0.0959 373/500 [=====================>........] - ETA: 41s - loss: 0.7289 - regression_loss: 0.6331 - classification_loss: 0.0958 374/500 [=====================>........] - ETA: 41s - loss: 0.7300 - regression_loss: 0.6340 - classification_loss: 0.0960 375/500 [=====================>........] - ETA: 41s - loss: 0.7316 - regression_loss: 0.6353 - classification_loss: 0.0964 376/500 [=====================>........] - ETA: 40s - loss: 0.7326 - regression_loss: 0.6361 - classification_loss: 0.0965 377/500 [=====================>........] - ETA: 40s - loss: 0.7344 - regression_loss: 0.6376 - classification_loss: 0.0968 378/500 [=====================>........] - ETA: 40s - loss: 0.7336 - regression_loss: 0.6369 - classification_loss: 0.0967 379/500 [=====================>........] - ETA: 39s - loss: 0.7327 - regression_loss: 0.6361 - classification_loss: 0.0966 380/500 [=====================>........] - ETA: 39s - loss: 0.7333 - regression_loss: 0.6367 - classification_loss: 0.0967 381/500 [=====================>........] - ETA: 39s - loss: 0.7338 - regression_loss: 0.6370 - classification_loss: 0.0968 382/500 [=====================>........] - ETA: 38s - loss: 0.7330 - regression_loss: 0.6363 - classification_loss: 0.0966 383/500 [=====================>........] - ETA: 38s - loss: 0.7322 - regression_loss: 0.6356 - classification_loss: 0.0966 384/500 [======================>.......] - ETA: 38s - loss: 0.7311 - regression_loss: 0.6347 - classification_loss: 0.0964 385/500 [======================>.......] - ETA: 37s - loss: 0.7311 - regression_loss: 0.6348 - classification_loss: 0.0963 386/500 [======================>.......] - ETA: 37s - loss: 0.7316 - regression_loss: 0.6353 - classification_loss: 0.0964 387/500 [======================>.......] - ETA: 37s - loss: 0.7324 - regression_loss: 0.6359 - classification_loss: 0.0966 388/500 [======================>.......] 
- ETA: 36s - loss: 0.7322 - regression_loss: 0.6356 - classification_loss: 0.0966 389/500 [======================>.......] - ETA: 36s - loss: 0.7320 - regression_loss: 0.6354 - classification_loss: 0.0966 390/500 [======================>.......] - ETA: 36s - loss: 0.7314 - regression_loss: 0.6349 - classification_loss: 0.0965 391/500 [======================>.......] - ETA: 35s - loss: 0.7304 - regression_loss: 0.6341 - classification_loss: 0.0963 392/500 [======================>.......] - ETA: 35s - loss: 0.7303 - regression_loss: 0.6339 - classification_loss: 0.0964 393/500 [======================>.......] - ETA: 35s - loss: 0.7315 - regression_loss: 0.6349 - classification_loss: 0.0966 394/500 [======================>.......] - ETA: 34s - loss: 0.7318 - regression_loss: 0.6352 - classification_loss: 0.0966 395/500 [======================>.......] - ETA: 34s - loss: 0.7306 - regression_loss: 0.6342 - classification_loss: 0.0964 396/500 [======================>.......] - ETA: 34s - loss: 0.7308 - regression_loss: 0.6344 - classification_loss: 0.0965 397/500 [======================>.......] - ETA: 33s - loss: 0.7311 - regression_loss: 0.6345 - classification_loss: 0.0966 398/500 [======================>.......] - ETA: 33s - loss: 0.7314 - regression_loss: 0.6348 - classification_loss: 0.0966 399/500 [======================>.......] - ETA: 33s - loss: 0.7313 - regression_loss: 0.6347 - classification_loss: 0.0966 400/500 [=======================>......] - ETA: 32s - loss: 0.7317 - regression_loss: 0.6350 - classification_loss: 0.0967 401/500 [=======================>......] - ETA: 32s - loss: 0.7317 - regression_loss: 0.6351 - classification_loss: 0.0966 402/500 [=======================>......] - ETA: 32s - loss: 0.7313 - regression_loss: 0.6347 - classification_loss: 0.0966 403/500 [=======================>......] - ETA: 31s - loss: 0.7317 - regression_loss: 0.6351 - classification_loss: 0.0966 404/500 [=======================>......] 
- ETA: 31s - loss: 0.7323 - regression_loss: 0.6356 - classification_loss: 0.0967 405/500 [=======================>......] - ETA: 31s - loss: 0.7321 - regression_loss: 0.6354 - classification_loss: 0.0967 406/500 [=======================>......] - ETA: 30s - loss: 0.7323 - regression_loss: 0.6356 - classification_loss: 0.0967 407/500 [=======================>......] - ETA: 30s - loss: 0.7328 - regression_loss: 0.6360 - classification_loss: 0.0968 408/500 [=======================>......] - ETA: 30s - loss: 0.7332 - regression_loss: 0.6364 - classification_loss: 0.0968 409/500 [=======================>......] - ETA: 29s - loss: 0.7331 - regression_loss: 0.6362 - classification_loss: 0.0969 410/500 [=======================>......] - ETA: 29s - loss: 0.7333 - regression_loss: 0.6364 - classification_loss: 0.0969 411/500 [=======================>......] - ETA: 29s - loss: 0.7352 - regression_loss: 0.6380 - classification_loss: 0.0973 412/500 [=======================>......] - ETA: 28s - loss: 0.7354 - regression_loss: 0.6382 - classification_loss: 0.0973 413/500 [=======================>......] - ETA: 28s - loss: 0.7357 - regression_loss: 0.6384 - classification_loss: 0.0972 414/500 [=======================>......] - ETA: 28s - loss: 0.7354 - regression_loss: 0.6382 - classification_loss: 0.0972 415/500 [=======================>......] - ETA: 27s - loss: 0.7352 - regression_loss: 0.6380 - classification_loss: 0.0972 416/500 [=======================>......] - ETA: 27s - loss: 0.7348 - regression_loss: 0.6378 - classification_loss: 0.0971 417/500 [========================>.....] - ETA: 27s - loss: 0.7343 - regression_loss: 0.6373 - classification_loss: 0.0970 418/500 [========================>.....] - ETA: 26s - loss: 0.7346 - regression_loss: 0.6377 - classification_loss: 0.0969 419/500 [========================>.....] - ETA: 26s - loss: 0.7350 - regression_loss: 0.6379 - classification_loss: 0.0971 420/500 [========================>.....] 
- ETA: 26s - loss: 0.7347 - regression_loss: 0.6378 - classification_loss: 0.0969 421/500 [========================>.....] - ETA: 26s - loss: 0.7347 - regression_loss: 0.6379 - classification_loss: 0.0968 422/500 [========================>.....] - ETA: 25s - loss: 0.7339 - regression_loss: 0.6372 - classification_loss: 0.0967 423/500 [========================>.....] - ETA: 25s - loss: 0.7342 - regression_loss: 0.6374 - classification_loss: 0.0968 424/500 [========================>.....] - ETA: 25s - loss: 0.7329 - regression_loss: 0.6363 - classification_loss: 0.0966 425/500 [========================>.....] - ETA: 24s - loss: 0.7331 - regression_loss: 0.6365 - classification_loss: 0.0965 426/500 [========================>.....] - ETA: 24s - loss: 0.7332 - regression_loss: 0.6366 - classification_loss: 0.0966 427/500 [========================>.....] - ETA: 24s - loss: 0.7338 - regression_loss: 0.6372 - classification_loss: 0.0966 428/500 [========================>.....] - ETA: 23s - loss: 0.7339 - regression_loss: 0.6373 - classification_loss: 0.0966 429/500 [========================>.....] - ETA: 23s - loss: 0.7340 - regression_loss: 0.6374 - classification_loss: 0.0966 430/500 [========================>.....] - ETA: 23s - loss: 0.7347 - regression_loss: 0.6380 - classification_loss: 0.0967 431/500 [========================>.....] - ETA: 22s - loss: 0.7347 - regression_loss: 0.6381 - classification_loss: 0.0967 432/500 [========================>.....] - ETA: 22s - loss: 0.7355 - regression_loss: 0.6387 - classification_loss: 0.0968 433/500 [========================>.....] - ETA: 22s - loss: 0.7354 - regression_loss: 0.6386 - classification_loss: 0.0968 434/500 [=========================>....] - ETA: 21s - loss: 0.7358 - regression_loss: 0.6389 - classification_loss: 0.0969 435/500 [=========================>....] - ETA: 21s - loss: 0.7363 - regression_loss: 0.6394 - classification_loss: 0.0969 436/500 [=========================>....] 
- ETA: 21s - loss: 0.7362 - regression_loss: 0.6393 - classification_loss: 0.0970 437/500 [=========================>....] - ETA: 20s - loss: 0.7371 - regression_loss: 0.6400 - classification_loss: 0.0970 438/500 [=========================>....] - ETA: 20s - loss: 0.7375 - regression_loss: 0.6405 - classification_loss: 0.0970 439/500 [=========================>....] - ETA: 20s - loss: 0.7375 - regression_loss: 0.6405 - classification_loss: 0.0970 440/500 [=========================>....] - ETA: 19s - loss: 0.7365 - regression_loss: 0.6396 - classification_loss: 0.0968 441/500 [=========================>....] - ETA: 19s - loss: 0.7366 - regression_loss: 0.6397 - classification_loss: 0.0968 442/500 [=========================>....] - ETA: 19s - loss: 0.7354 - regression_loss: 0.6387 - classification_loss: 0.0967 443/500 [=========================>....] - ETA: 18s - loss: 0.7347 - regression_loss: 0.6381 - classification_loss: 0.0966 444/500 [=========================>....] - ETA: 18s - loss: 0.7339 - regression_loss: 0.6374 - classification_loss: 0.0965 445/500 [=========================>....] - ETA: 18s - loss: 0.7346 - regression_loss: 0.6381 - classification_loss: 0.0965 446/500 [=========================>....] - ETA: 17s - loss: 0.7343 - regression_loss: 0.6378 - classification_loss: 0.0965 447/500 [=========================>....] - ETA: 17s - loss: 0.7344 - regression_loss: 0.6379 - classification_loss: 0.0965 448/500 [=========================>....] - ETA: 17s - loss: 0.7340 - regression_loss: 0.6376 - classification_loss: 0.0965 449/500 [=========================>....] - ETA: 16s - loss: 0.7337 - regression_loss: 0.6373 - classification_loss: 0.0964 450/500 [==========================>...] - ETA: 16s - loss: 0.7339 - regression_loss: 0.6375 - classification_loss: 0.0965 451/500 [==========================>...] - ETA: 16s - loss: 0.7336 - regression_loss: 0.6372 - classification_loss: 0.0964 452/500 [==========================>...] 
- ETA: 15s - loss: 0.7333 - regression_loss: 0.6369 - classification_loss: 0.0964 453/500 [==========================>...] - ETA: 15s - loss: 0.7322 - regression_loss: 0.6360 - classification_loss: 0.0963 454/500 [==========================>...] - ETA: 15s - loss: 0.7322 - regression_loss: 0.6359 - classification_loss: 0.0963 455/500 [==========================>...] - ETA: 14s - loss: 0.7323 - regression_loss: 0.6360 - classification_loss: 0.0962 456/500 [==========================>...] - ETA: 14s - loss: 0.7316 - regression_loss: 0.6354 - classification_loss: 0.0961 457/500 [==========================>...] - ETA: 14s - loss: 0.7314 - regression_loss: 0.6353 - classification_loss: 0.0961 458/500 [==========================>...] - ETA: 13s - loss: 0.7306 - regression_loss: 0.6346 - classification_loss: 0.0960 459/500 [==========================>...] - ETA: 13s - loss: 0.7296 - regression_loss: 0.6337 - classification_loss: 0.0959 460/500 [==========================>...] - ETA: 13s - loss: 0.7290 - regression_loss: 0.6332 - classification_loss: 0.0958 461/500 [==========================>...] - ETA: 12s - loss: 0.7296 - regression_loss: 0.6336 - classification_loss: 0.0959 462/500 [==========================>...] - ETA: 12s - loss: 0.7313 - regression_loss: 0.6350 - classification_loss: 0.0962 463/500 [==========================>...] - ETA: 12s - loss: 0.7312 - regression_loss: 0.6350 - classification_loss: 0.0962 464/500 [==========================>...] - ETA: 11s - loss: 0.7310 - regression_loss: 0.6348 - classification_loss: 0.0962 465/500 [==========================>...] - ETA: 11s - loss: 0.7306 - regression_loss: 0.6344 - classification_loss: 0.0961 466/500 [==========================>...] - ETA: 11s - loss: 0.7298 - regression_loss: 0.6338 - classification_loss: 0.0960 467/500 [===========================>..] - ETA: 10s - loss: 0.7299 - regression_loss: 0.6339 - classification_loss: 0.0960 468/500 [===========================>..] 
- ETA: 10s - loss: 0.7300 - regression_loss: 0.6340 - classification_loss: 0.0959 469/500 [===========================>..] - ETA: 10s - loss: 0.7301 - regression_loss: 0.6342 - classification_loss: 0.0959 470/500 [===========================>..] - ETA: 9s - loss: 0.7301 - regression_loss: 0.6342 - classification_loss: 0.0960 471/500 [===========================>..] - ETA: 9s - loss: 0.7309 - regression_loss: 0.6347 - classification_loss: 0.0962 472/500 [===========================>..] - ETA: 9s - loss: 0.7305 - regression_loss: 0.6344 - classification_loss: 0.0961 473/500 [===========================>..] - ETA: 8s - loss: 0.7301 - regression_loss: 0.6341 - classification_loss: 0.0960 474/500 [===========================>..] - ETA: 8s - loss: 0.7296 - regression_loss: 0.6336 - classification_loss: 0.0960 475/500 [===========================>..] - ETA: 8s - loss: 0.7286 - regression_loss: 0.6327 - classification_loss: 0.0958 476/500 [===========================>..] - ETA: 7s - loss: 0.7287 - regression_loss: 0.6329 - classification_loss: 0.0958 477/500 [===========================>..] - ETA: 7s - loss: 0.7284 - regression_loss: 0.6326 - classification_loss: 0.0958 478/500 [===========================>..] - ETA: 7s - loss: 0.7276 - regression_loss: 0.6320 - classification_loss: 0.0956 479/500 [===========================>..] - ETA: 6s - loss: 0.7267 - regression_loss: 0.6312 - classification_loss: 0.0955 480/500 [===========================>..] - ETA: 6s - loss: 0.7271 - regression_loss: 0.6316 - classification_loss: 0.0955 481/500 [===========================>..] - ETA: 6s - loss: 0.7267 - regression_loss: 0.6313 - classification_loss: 0.0954 482/500 [===========================>..] - ETA: 5s - loss: 0.7273 - regression_loss: 0.6318 - classification_loss: 0.0955 483/500 [===========================>..] - ETA: 5s - loss: 0.7277 - regression_loss: 0.6322 - classification_loss: 0.0955 484/500 [============================>.] 
- ETA: 5s - loss: 0.7275 - regression_loss: 0.6321 - classification_loss: 0.0954 485/500 [============================>.] - ETA: 4s - loss: 0.7271 - regression_loss: 0.6317 - classification_loss: 0.0954 486/500 [============================>.] - ETA: 4s - loss: 0.7270 - regression_loss: 0.6316 - classification_loss: 0.0954 487/500 [============================>.] - ETA: 4s - loss: 0.7269 - regression_loss: 0.6316 - classification_loss: 0.0954 488/500 [============================>.] - ETA: 3s - loss: 0.7261 - regression_loss: 0.6309 - classification_loss: 0.0953 489/500 [============================>.] - ETA: 3s - loss: 0.7257 - regression_loss: 0.6305 - classification_loss: 0.0952 490/500 [============================>.] - ETA: 3s - loss: 0.7259 - regression_loss: 0.6307 - classification_loss: 0.0952 491/500 [============================>.] - ETA: 2s - loss: 0.7264 - regression_loss: 0.6312 - classification_loss: 0.0952 492/500 [============================>.] - ETA: 2s - loss: 0.7262 - regression_loss: 0.6311 - classification_loss: 0.0952 493/500 [============================>.] - ETA: 2s - loss: 0.7267 - regression_loss: 0.6316 - classification_loss: 0.0951 494/500 [============================>.] - ETA: 1s - loss: 0.7260 - regression_loss: 0.6310 - classification_loss: 0.0950 495/500 [============================>.] - ETA: 1s - loss: 0.7255 - regression_loss: 0.6306 - classification_loss: 0.0949 496/500 [============================>.] - ETA: 1s - loss: 0.7256 - regression_loss: 0.6308 - classification_loss: 0.0948 497/500 [============================>.] - ETA: 0s - loss: 0.7267 - regression_loss: 0.6318 - classification_loss: 0.0948 498/500 [============================>.] - ETA: 0s - loss: 0.7260 - regression_loss: 0.6313 - classification_loss: 0.0947 499/500 [============================>.] 
- ETA: 0s - loss: 0.7272 - regression_loss: 0.6323 - classification_loss: 0.0949 500/500 [==============================] - 165s 330ms/step - loss: 0.7278 - regression_loss: 0.6328 - classification_loss: 0.0950
1172 instances of class plum with average precision: 0.6381
mAP: 0.6381
Epoch 00042: saving model to ./training/snapshots/resnet101_pascal_42.h5
Epoch 43/150
1/500 [..............................] - ETA: 2:34 - loss: 1.2028 - regression_loss: 1.0465 - classification_loss: 0.1564 2/500 [..............................] - ETA: 2:34 - loss: 1.0112 - regression_loss: 0.8725 - classification_loss: 0.1387 3/500 [..............................] - ETA: 2:35 - loss: 1.0015 - regression_loss: 0.8664 - classification_loss: 0.1350 4/500 [..............................] - ETA: 2:35 - loss: 0.9883 - regression_loss: 0.8535 - classification_loss: 0.1348 5/500 [..............................] - ETA: 2:37 - loss: 0.9531 - regression_loss: 0.8241 - classification_loss: 0.1290 6/500 [..............................] - ETA: 2:39 - loss: 0.8641 - regression_loss: 0.7476 - classification_loss: 0.1165 7/500 [..............................] - ETA: 2:38 - loss: 0.8534 - regression_loss: 0.7368 - classification_loss: 0.1167 8/500 [..............................] - ETA: 2:38 - loss: 0.8585 - regression_loss: 0.7416 - classification_loss: 0.1169 9/500 [..............................] - ETA: 2:39 - loss: 0.8334 - regression_loss: 0.7243 - classification_loss: 0.1091 10/500 [..............................] - ETA: 2:39 - loss: 0.7883 - regression_loss: 0.6860 - classification_loss: 0.1024 11/500 [..............................] - ETA: 2:40 - loss: 0.7482 - regression_loss: 0.6522 - classification_loss: 0.0960 12/500 [..............................] - ETA: 2:40 - loss: 0.7305 - regression_loss: 0.6367 - classification_loss: 0.0938 13/500 [..............................] - ETA: 2:39 - loss: 0.7258 - regression_loss: 0.6298 - classification_loss: 0.0960 14/500 [..............................] 
- ETA: 2:39 - loss: 0.6962 - regression_loss: 0.6051 - classification_loss: 0.0911 15/500 [..............................] - ETA: 2:40 - loss: 0.7316 - regression_loss: 0.6348 - classification_loss: 0.0968 16/500 [..............................] - ETA: 2:40 - loss: 0.7316 - regression_loss: 0.6334 - classification_loss: 0.0982 17/500 [>.............................] - ETA: 2:40 - loss: 0.7366 - regression_loss: 0.6382 - classification_loss: 0.0984 18/500 [>.............................] - ETA: 2:40 - loss: 0.7070 - regression_loss: 0.6125 - classification_loss: 0.0945 19/500 [>.............................] - ETA: 2:39 - loss: 0.6859 - regression_loss: 0.5953 - classification_loss: 0.0906 20/500 [>.............................] - ETA: 2:39 - loss: 0.7002 - regression_loss: 0.6076 - classification_loss: 0.0926 21/500 [>.............................] - ETA: 2:38 - loss: 0.6950 - regression_loss: 0.6030 - classification_loss: 0.0919 22/500 [>.............................] - ETA: 2:38 - loss: 0.6896 - regression_loss: 0.5995 - classification_loss: 0.0901 23/500 [>.............................] - ETA: 2:38 - loss: 0.7027 - regression_loss: 0.6128 - classification_loss: 0.0898 24/500 [>.............................] - ETA: 2:37 - loss: 0.7278 - regression_loss: 0.6339 - classification_loss: 0.0938 25/500 [>.............................] - ETA: 2:37 - loss: 0.7184 - regression_loss: 0.6254 - classification_loss: 0.0930 26/500 [>.............................] - ETA: 2:36 - loss: 0.7243 - regression_loss: 0.6306 - classification_loss: 0.0938 27/500 [>.............................] - ETA: 2:36 - loss: 0.7302 - regression_loss: 0.6349 - classification_loss: 0.0954 28/500 [>.............................] - ETA: 2:35 - loss: 0.7350 - regression_loss: 0.6381 - classification_loss: 0.0969 29/500 [>.............................] - ETA: 2:34 - loss: 0.7392 - regression_loss: 0.6422 - classification_loss: 0.0970 30/500 [>.............................] 
- ETA: 2:35 - loss: 0.7340 - regression_loss: 0.6377 - classification_loss: 0.0963 31/500 [>.............................] - ETA: 2:34 - loss: 0.7286 - regression_loss: 0.6338 - classification_loss: 0.0947 32/500 [>.............................] - ETA: 2:34 - loss: 0.7325 - regression_loss: 0.6381 - classification_loss: 0.0944 33/500 [>.............................] - ETA: 2:34 - loss: 0.7470 - regression_loss: 0.6492 - classification_loss: 0.0978 34/500 [=>............................] - ETA: 2:34 - loss: 0.7467 - regression_loss: 0.6498 - classification_loss: 0.0969 35/500 [=>............................] - ETA: 2:33 - loss: 0.7477 - regression_loss: 0.6505 - classification_loss: 0.0972 36/500 [=>............................] - ETA: 2:33 - loss: 0.7480 - regression_loss: 0.6501 - classification_loss: 0.0979 37/500 [=>............................] - ETA: 2:32 - loss: 0.7388 - regression_loss: 0.6425 - classification_loss: 0.0963 38/500 [=>............................] - ETA: 2:32 - loss: 0.7351 - regression_loss: 0.6394 - classification_loss: 0.0956 39/500 [=>............................] - ETA: 2:32 - loss: 0.7380 - regression_loss: 0.6415 - classification_loss: 0.0965 40/500 [=>............................] - ETA: 2:31 - loss: 0.7505 - regression_loss: 0.6517 - classification_loss: 0.0989 41/500 [=>............................] - ETA: 2:31 - loss: 0.7400 - regression_loss: 0.6426 - classification_loss: 0.0974 42/500 [=>............................] - ETA: 2:30 - loss: 0.7342 - regression_loss: 0.6371 - classification_loss: 0.0971 43/500 [=>............................] - ETA: 2:30 - loss: 0.7352 - regression_loss: 0.6378 - classification_loss: 0.0973 44/500 [=>............................] - ETA: 2:30 - loss: 0.7495 - regression_loss: 0.6490 - classification_loss: 0.1005 45/500 [=>............................] - ETA: 2:29 - loss: 0.7453 - regression_loss: 0.6456 - classification_loss: 0.0997 46/500 [=>............................] 
- ETA: 2:29 - loss: 0.7412 - regression_loss: 0.6420 - classification_loss: 0.0992 47/500 [=>............................] - ETA: 2:29 - loss: 0.7313 - regression_loss: 0.6336 - classification_loss: 0.0976 48/500 [=>............................] - ETA: 2:28 - loss: 0.7216 - regression_loss: 0.6255 - classification_loss: 0.0961 49/500 [=>............................] - ETA: 2:28 - loss: 0.7289 - regression_loss: 0.6314 - classification_loss: 0.0976 50/500 [==>...........................] - ETA: 2:28 - loss: 0.7334 - regression_loss: 0.6353 - classification_loss: 0.0981 51/500 [==>...........................] - ETA: 2:27 - loss: 0.7245 - regression_loss: 0.6278 - classification_loss: 0.0966 52/500 [==>...........................] - ETA: 2:27 - loss: 0.7348 - regression_loss: 0.6360 - classification_loss: 0.0988 53/500 [==>...........................] - ETA: 2:27 - loss: 0.7263 - regression_loss: 0.6274 - classification_loss: 0.0989 54/500 [==>...........................] - ETA: 2:27 - loss: 0.7305 - regression_loss: 0.6311 - classification_loss: 0.0995 55/500 [==>...........................] - ETA: 2:26 - loss: 0.7244 - regression_loss: 0.6260 - classification_loss: 0.0984 56/500 [==>...........................] - ETA: 2:26 - loss: 0.7167 - regression_loss: 0.6194 - classification_loss: 0.0973 57/500 [==>...........................] - ETA: 2:26 - loss: 0.7180 - regression_loss: 0.6209 - classification_loss: 0.0971 58/500 [==>...........................] - ETA: 2:25 - loss: 0.7177 - regression_loss: 0.6203 - classification_loss: 0.0974 59/500 [==>...........................] - ETA: 2:25 - loss: 0.7208 - regression_loss: 0.6227 - classification_loss: 0.0981 60/500 [==>...........................] - ETA: 2:25 - loss: 0.7167 - regression_loss: 0.6194 - classification_loss: 0.0973 61/500 [==>...........................] - ETA: 2:24 - loss: 0.7120 - regression_loss: 0.6150 - classification_loss: 0.0970 62/500 [==>...........................] 
- ETA: 2:24 - loss: 0.7058 - regression_loss: 0.6096 - classification_loss: 0.0961 63/500 [==>...........................] - ETA: 2:23 - loss: 0.7086 - regression_loss: 0.6121 - classification_loss: 0.0965 64/500 [==>...........................] - ETA: 2:23 - loss: 0.7046 - regression_loss: 0.6090 - classification_loss: 0.0955 65/500 [==>...........................] - ETA: 2:23 - loss: 0.7082 - regression_loss: 0.6120 - classification_loss: 0.0962 66/500 [==>...........................] - ETA: 2:22 - loss: 0.7038 - regression_loss: 0.6087 - classification_loss: 0.0950 67/500 [===>..........................] - ETA: 2:22 - loss: 0.7031 - regression_loss: 0.6084 - classification_loss: 0.0947 68/500 [===>..........................] - ETA: 2:22 - loss: 0.7001 - regression_loss: 0.6058 - classification_loss: 0.0944 69/500 [===>..........................] - ETA: 2:21 - loss: 0.6971 - regression_loss: 0.6034 - classification_loss: 0.0938 70/500 [===>..........................] - ETA: 2:21 - loss: 0.6963 - regression_loss: 0.6027 - classification_loss: 0.0937 71/500 [===>..........................] - ETA: 2:20 - loss: 0.6927 - regression_loss: 0.5998 - classification_loss: 0.0929 72/500 [===>..........................] - ETA: 2:20 - loss: 0.6897 - regression_loss: 0.5977 - classification_loss: 0.0920 73/500 [===>..........................] - ETA: 2:20 - loss: 0.6926 - regression_loss: 0.6004 - classification_loss: 0.0922 74/500 [===>..........................] - ETA: 2:19 - loss: 0.6947 - regression_loss: 0.6017 - classification_loss: 0.0929 75/500 [===>..........................] - ETA: 2:19 - loss: 0.7014 - regression_loss: 0.6072 - classification_loss: 0.0942 76/500 [===>..........................] - ETA: 2:19 - loss: 0.7065 - regression_loss: 0.6113 - classification_loss: 0.0952 77/500 [===>..........................] - ETA: 2:18 - loss: 0.7082 - regression_loss: 0.6124 - classification_loss: 0.0957 78/500 [===>..........................] 
- ETA: 2:18 - loss: 0.7047 - regression_loss: 0.6096 - classification_loss: 0.0952 79/500 [===>..........................] - ETA: 2:18 - loss: 0.7040 - regression_loss: 0.6087 - classification_loss: 0.0953 80/500 [===>..........................] - ETA: 2:17 - loss: 0.6994 - regression_loss: 0.6046 - classification_loss: 0.0948 81/500 [===>..........................] - ETA: 2:17 - loss: 0.6981 - regression_loss: 0.6034 - classification_loss: 0.0948 82/500 [===>..........................] - ETA: 2:17 - loss: 0.6926 - regression_loss: 0.5985 - classification_loss: 0.0941 83/500 [===>..........................] - ETA: 2:16 - loss: 0.6924 - regression_loss: 0.5987 - classification_loss: 0.0937 84/500 [====>.........................] - ETA: 2:16 - loss: 0.6880 - regression_loss: 0.5951 - classification_loss: 0.0929 85/500 [====>.........................] - ETA: 2:16 - loss: 0.6900 - regression_loss: 0.5970 - classification_loss: 0.0930 86/500 [====>.........................] - ETA: 2:15 - loss: 0.6880 - regression_loss: 0.5955 - classification_loss: 0.0925 87/500 [====>.........................] - ETA: 2:15 - loss: 0.6921 - regression_loss: 0.5992 - classification_loss: 0.0930 88/500 [====>.........................] - ETA: 2:15 - loss: 0.6934 - regression_loss: 0.6005 - classification_loss: 0.0929 89/500 [====>.........................] - ETA: 2:14 - loss: 0.6952 - regression_loss: 0.6022 - classification_loss: 0.0930 90/500 [====>.........................] - ETA: 2:14 - loss: 0.6989 - regression_loss: 0.6055 - classification_loss: 0.0934 91/500 [====>.........................] - ETA: 2:14 - loss: 0.6996 - regression_loss: 0.6061 - classification_loss: 0.0935 92/500 [====>.........................] - ETA: 2:13 - loss: 0.7027 - regression_loss: 0.6088 - classification_loss: 0.0939 93/500 [====>.........................] - ETA: 2:13 - loss: 0.7022 - regression_loss: 0.6089 - classification_loss: 0.0933 94/500 [====>.........................] 
- ETA: 2:13 - loss: 0.7001 - regression_loss: 0.6073 - classification_loss: 0.0928
 95/500 [====>.........................] - ETA: 2:12 - loss: 0.7004 - regression_loss: 0.6074 - classification_loss: 0.0930
[... transient per-step progress updates trimmed; running averages sampled every 50 steps below ...]
100/500 [=====>........................] - ETA: 2:11 - loss: 0.6987 - regression_loss: 0.6057 - classification_loss: 0.0930
150/500 [========>.....................] - ETA: 1:54 - loss: 0.7485 - regression_loss: 0.6515 - classification_loss: 0.0970
200/500 [===========>..................] - ETA: 1:38 - loss: 0.7559 - regression_loss: 0.6596 - classification_loss: 0.0962
250/500 [==============>...............] - ETA: 1:22 - loss: 0.7435 - regression_loss: 0.6488 - classification_loss: 0.0947
300/500 [=================>............] - ETA: 1:05 - loss: 0.7366 - regression_loss: 0.6427 - classification_loss: 0.0939
350/500 [====================>.........] - ETA: 49s - loss: 0.7301 - regression_loss: 0.6371 - classification_loss: 0.0930
400/500 [=======================>......] - ETA: 32s - loss: 0.7323 - regression_loss: 0.6385 - classification_loss: 0.0939
429/500 [========================>.....] - ETA: 23s - loss: 0.7350 - regression_loss: 0.6411 - classification_loss: 0.0939
430/500 [========================>.....] 
- ETA: 23s - loss: 0.7353 - regression_loss: 0.6413 - classification_loss: 0.0940 431/500 [========================>.....] - ETA: 22s - loss: 0.7358 - regression_loss: 0.6416 - classification_loss: 0.0941 432/500 [========================>.....] - ETA: 22s - loss: 0.7361 - regression_loss: 0.6420 - classification_loss: 0.0942 433/500 [========================>.....] - ETA: 22s - loss: 0.7353 - regression_loss: 0.6412 - classification_loss: 0.0941 434/500 [=========================>....] - ETA: 21s - loss: 0.7357 - regression_loss: 0.6415 - classification_loss: 0.0941 435/500 [=========================>....] - ETA: 21s - loss: 0.7350 - regression_loss: 0.6410 - classification_loss: 0.0940 436/500 [=========================>....] - ETA: 21s - loss: 0.7346 - regression_loss: 0.6406 - classification_loss: 0.0940 437/500 [=========================>....] - ETA: 20s - loss: 0.7347 - regression_loss: 0.6406 - classification_loss: 0.0940 438/500 [=========================>....] - ETA: 20s - loss: 0.7343 - regression_loss: 0.6403 - classification_loss: 0.0940 439/500 [=========================>....] - ETA: 20s - loss: 0.7346 - regression_loss: 0.6405 - classification_loss: 0.0941 440/500 [=========================>....] - ETA: 19s - loss: 0.7336 - regression_loss: 0.6396 - classification_loss: 0.0940 441/500 [=========================>....] - ETA: 19s - loss: 0.7341 - regression_loss: 0.6402 - classification_loss: 0.0940 442/500 [=========================>....] - ETA: 19s - loss: 0.7334 - regression_loss: 0.6396 - classification_loss: 0.0938 443/500 [=========================>....] - ETA: 18s - loss: 0.7337 - regression_loss: 0.6398 - classification_loss: 0.0939 444/500 [=========================>....] - ETA: 18s - loss: 0.7328 - regression_loss: 0.6391 - classification_loss: 0.0937 445/500 [=========================>....] - ETA: 18s - loss: 0.7320 - regression_loss: 0.6383 - classification_loss: 0.0936 446/500 [=========================>....] 
- ETA: 17s - loss: 0.7335 - regression_loss: 0.6396 - classification_loss: 0.0938 447/500 [=========================>....] - ETA: 17s - loss: 0.7329 - regression_loss: 0.6391 - classification_loss: 0.0938 448/500 [=========================>....] - ETA: 17s - loss: 0.7328 - regression_loss: 0.6390 - classification_loss: 0.0938 449/500 [=========================>....] - ETA: 16s - loss: 0.7322 - regression_loss: 0.6385 - classification_loss: 0.0937 450/500 [==========================>...] - ETA: 16s - loss: 0.7316 - regression_loss: 0.6380 - classification_loss: 0.0936 451/500 [==========================>...] - ETA: 16s - loss: 0.7325 - regression_loss: 0.6387 - classification_loss: 0.0937 452/500 [==========================>...] - ETA: 15s - loss: 0.7317 - regression_loss: 0.6381 - classification_loss: 0.0936 453/500 [==========================>...] - ETA: 15s - loss: 0.7314 - regression_loss: 0.6379 - classification_loss: 0.0935 454/500 [==========================>...] - ETA: 15s - loss: 0.7314 - regression_loss: 0.6378 - classification_loss: 0.0935 455/500 [==========================>...] - ETA: 14s - loss: 0.7314 - regression_loss: 0.6378 - classification_loss: 0.0935 456/500 [==========================>...] - ETA: 14s - loss: 0.7320 - regression_loss: 0.6383 - classification_loss: 0.0937 457/500 [==========================>...] - ETA: 14s - loss: 0.7312 - regression_loss: 0.6376 - classification_loss: 0.0936 458/500 [==========================>...] - ETA: 13s - loss: 0.7313 - regression_loss: 0.6377 - classification_loss: 0.0936 459/500 [==========================>...] - ETA: 13s - loss: 0.7315 - regression_loss: 0.6379 - classification_loss: 0.0936 460/500 [==========================>...] - ETA: 13s - loss: 0.7322 - regression_loss: 0.6385 - classification_loss: 0.0938 461/500 [==========================>...] - ETA: 12s - loss: 0.7331 - regression_loss: 0.6393 - classification_loss: 0.0939 462/500 [==========================>...] 
- ETA: 12s - loss: 0.7338 - regression_loss: 0.6398 - classification_loss: 0.0940 463/500 [==========================>...] - ETA: 12s - loss: 0.7330 - regression_loss: 0.6391 - classification_loss: 0.0939 464/500 [==========================>...] - ETA: 11s - loss: 0.7321 - regression_loss: 0.6384 - classification_loss: 0.0937 465/500 [==========================>...] - ETA: 11s - loss: 0.7324 - regression_loss: 0.6386 - classification_loss: 0.0938 466/500 [==========================>...] - ETA: 11s - loss: 0.7326 - regression_loss: 0.6388 - classification_loss: 0.0938 467/500 [===========================>..] - ETA: 10s - loss: 0.7331 - regression_loss: 0.6392 - classification_loss: 0.0939 468/500 [===========================>..] - ETA: 10s - loss: 0.7327 - regression_loss: 0.6389 - classification_loss: 0.0938 469/500 [===========================>..] - ETA: 10s - loss: 0.7328 - regression_loss: 0.6390 - classification_loss: 0.0938 470/500 [===========================>..] - ETA: 9s - loss: 0.7335 - regression_loss: 0.6395 - classification_loss: 0.0940  471/500 [===========================>..] - ETA: 9s - loss: 0.7339 - regression_loss: 0.6398 - classification_loss: 0.0941 472/500 [===========================>..] - ETA: 9s - loss: 0.7333 - regression_loss: 0.6393 - classification_loss: 0.0940 473/500 [===========================>..] - ETA: 8s - loss: 0.7329 - regression_loss: 0.6390 - classification_loss: 0.0939 474/500 [===========================>..] - ETA: 8s - loss: 0.7326 - regression_loss: 0.6387 - classification_loss: 0.0939 475/500 [===========================>..] - ETA: 8s - loss: 0.7325 - regression_loss: 0.6385 - classification_loss: 0.0940 476/500 [===========================>..] - ETA: 7s - loss: 0.7329 - regression_loss: 0.6389 - classification_loss: 0.0941 477/500 [===========================>..] - ETA: 7s - loss: 0.7327 - regression_loss: 0.6386 - classification_loss: 0.0941 478/500 [===========================>..] 
- ETA: 7s - loss: 0.7325 - regression_loss: 0.6385 - classification_loss: 0.0940 479/500 [===========================>..] - ETA: 6s - loss: 0.7322 - regression_loss: 0.6383 - classification_loss: 0.0939 480/500 [===========================>..] - ETA: 6s - loss: 0.7319 - regression_loss: 0.6381 - classification_loss: 0.0938 481/500 [===========================>..] - ETA: 6s - loss: 0.7318 - regression_loss: 0.6380 - classification_loss: 0.0938 482/500 [===========================>..] - ETA: 5s - loss: 0.7313 - regression_loss: 0.6375 - classification_loss: 0.0938 483/500 [===========================>..] - ETA: 5s - loss: 0.7319 - regression_loss: 0.6381 - classification_loss: 0.0938 484/500 [============================>.] - ETA: 5s - loss: 0.7321 - regression_loss: 0.6382 - classification_loss: 0.0939 485/500 [============================>.] - ETA: 4s - loss: 0.7313 - regression_loss: 0.6375 - classification_loss: 0.0938 486/500 [============================>.] - ETA: 4s - loss: 0.7311 - regression_loss: 0.6373 - classification_loss: 0.0938 487/500 [============================>.] - ETA: 4s - loss: 0.7313 - regression_loss: 0.6376 - classification_loss: 0.0937 488/500 [============================>.] - ETA: 3s - loss: 0.7322 - regression_loss: 0.6384 - classification_loss: 0.0939 489/500 [============================>.] - ETA: 3s - loss: 0.7323 - regression_loss: 0.6383 - classification_loss: 0.0940 490/500 [============================>.] - ETA: 3s - loss: 0.7321 - regression_loss: 0.6382 - classification_loss: 0.0939 491/500 [============================>.] - ETA: 2s - loss: 0.7324 - regression_loss: 0.6385 - classification_loss: 0.0940 492/500 [============================>.] - ETA: 2s - loss: 0.7322 - regression_loss: 0.6382 - classification_loss: 0.0940 493/500 [============================>.] - ETA: 2s - loss: 0.7316 - regression_loss: 0.6377 - classification_loss: 0.0939 494/500 [============================>.] 
- ETA: 1s - loss: 0.7315 - regression_loss: 0.6375 - classification_loss: 0.0940 495/500 [============================>.] - ETA: 1s - loss: 0.7321 - regression_loss: 0.6380 - classification_loss: 0.0941 496/500 [============================>.] - ETA: 1s - loss: 0.7324 - regression_loss: 0.6383 - classification_loss: 0.0942 497/500 [============================>.] - ETA: 0s - loss: 0.7330 - regression_loss: 0.6386 - classification_loss: 0.0943 498/500 [============================>.] - ETA: 0s - loss: 0.7329 - regression_loss: 0.6385 - classification_loss: 0.0943 499/500 [============================>.] - ETA: 0s - loss: 0.7336 - regression_loss: 0.6391 - classification_loss: 0.0945 500/500 [==============================] - 165s 329ms/step - loss: 0.7336 - regression_loss: 0.6391 - classification_loss: 0.0945 1172 instances of class plum with average precision: 0.6367 mAP: 0.6367 Epoch 00043: saving model to ./training/snapshots/resnet101_pascal_43.h5 Epoch 44/150 1/500 [..............................] - ETA: 2:35 - loss: 0.7113 - regression_loss: 0.5940 - classification_loss: 0.1173 2/500 [..............................] - ETA: 2:37 - loss: 0.7512 - regression_loss: 0.6207 - classification_loss: 0.1305 3/500 [..............................] - ETA: 2:41 - loss: 0.6870 - regression_loss: 0.5735 - classification_loss: 0.1135 4/500 [..............................] - ETA: 2:42 - loss: 0.7151 - regression_loss: 0.5973 - classification_loss: 0.1178 5/500 [..............................] - ETA: 2:41 - loss: 0.7568 - regression_loss: 0.6386 - classification_loss: 0.1183 6/500 [..............................] - ETA: 2:40 - loss: 0.7052 - regression_loss: 0.5966 - classification_loss: 0.1085 7/500 [..............................] - ETA: 2:41 - loss: 0.7012 - regression_loss: 0.5957 - classification_loss: 0.1054 8/500 [..............................] - ETA: 2:41 - loss: 0.6545 - regression_loss: 0.5584 - classification_loss: 0.0961 9/500 [..............................] 
- ETA: 2:40 - loss: 0.6196 - regression_loss: 0.5299 - classification_loss: 0.0897 10/500 [..............................] - ETA: 2:40 - loss: 0.6261 - regression_loss: 0.5379 - classification_loss: 0.0882 11/500 [..............................] - ETA: 2:39 - loss: 0.5904 - regression_loss: 0.5063 - classification_loss: 0.0840 12/500 [..............................] - ETA: 2:39 - loss: 0.6295 - regression_loss: 0.5408 - classification_loss: 0.0887 13/500 [..............................] - ETA: 2:39 - loss: 0.6095 - regression_loss: 0.5229 - classification_loss: 0.0866 14/500 [..............................] - ETA: 2:39 - loss: 0.6624 - regression_loss: 0.5654 - classification_loss: 0.0970 15/500 [..............................] - ETA: 2:39 - loss: 0.6871 - regression_loss: 0.5848 - classification_loss: 0.1023 16/500 [..............................] - ETA: 2:38 - loss: 0.6927 - regression_loss: 0.5906 - classification_loss: 0.1021 17/500 [>.............................] - ETA: 2:38 - loss: 0.7051 - regression_loss: 0.6009 - classification_loss: 0.1043 18/500 [>.............................] - ETA: 2:38 - loss: 0.7392 - regression_loss: 0.6308 - classification_loss: 0.1084 19/500 [>.............................] - ETA: 2:37 - loss: 0.7312 - regression_loss: 0.6240 - classification_loss: 0.1072 20/500 [>.............................] - ETA: 2:37 - loss: 0.7171 - regression_loss: 0.6121 - classification_loss: 0.1050 21/500 [>.............................] - ETA: 2:37 - loss: 0.7052 - regression_loss: 0.6037 - classification_loss: 0.1015 22/500 [>.............................] - ETA: 2:36 - loss: 0.7307 - regression_loss: 0.6246 - classification_loss: 0.1061 23/500 [>.............................] - ETA: 2:36 - loss: 0.7246 - regression_loss: 0.6207 - classification_loss: 0.1038 24/500 [>.............................] - ETA: 2:35 - loss: 0.7248 - regression_loss: 0.6223 - classification_loss: 0.1025 25/500 [>.............................] 
- ETA: 2:36 - loss: 0.7040 - regression_loss: 0.6045 - classification_loss: 0.0996 26/500 [>.............................] - ETA: 2:35 - loss: 0.7202 - regression_loss: 0.6184 - classification_loss: 0.1018 27/500 [>.............................] - ETA: 2:35 - loss: 0.7058 - regression_loss: 0.6063 - classification_loss: 0.0995 28/500 [>.............................] - ETA: 2:35 - loss: 0.6889 - regression_loss: 0.5921 - classification_loss: 0.0967 29/500 [>.............................] - ETA: 2:34 - loss: 0.6935 - regression_loss: 0.5959 - classification_loss: 0.0975 30/500 [>.............................] - ETA: 2:34 - loss: 0.6910 - regression_loss: 0.5942 - classification_loss: 0.0969 31/500 [>.............................] - ETA: 2:34 - loss: 0.6892 - regression_loss: 0.5937 - classification_loss: 0.0955 32/500 [>.............................] - ETA: 2:33 - loss: 0.6996 - regression_loss: 0.6057 - classification_loss: 0.0940 33/500 [>.............................] - ETA: 2:33 - loss: 0.6908 - regression_loss: 0.5982 - classification_loss: 0.0926 34/500 [=>............................] - ETA: 2:32 - loss: 0.6805 - regression_loss: 0.5895 - classification_loss: 0.0909 35/500 [=>............................] - ETA: 2:32 - loss: 0.6872 - regression_loss: 0.5959 - classification_loss: 0.0913 36/500 [=>............................] - ETA: 2:32 - loss: 0.6875 - regression_loss: 0.5965 - classification_loss: 0.0910 37/500 [=>............................] - ETA: 2:31 - loss: 0.6953 - regression_loss: 0.6043 - classification_loss: 0.0910 38/500 [=>............................] - ETA: 2:31 - loss: 0.6847 - regression_loss: 0.5954 - classification_loss: 0.0893 39/500 [=>............................] - ETA: 2:31 - loss: 0.6811 - regression_loss: 0.5931 - classification_loss: 0.0880 40/500 [=>............................] - ETA: 2:31 - loss: 0.6790 - regression_loss: 0.5918 - classification_loss: 0.0871 41/500 [=>............................] 
- ETA: 2:31 - loss: 0.6859 - regression_loss: 0.5970 - classification_loss: 0.0889 42/500 [=>............................] - ETA: 2:30 - loss: 0.6784 - regression_loss: 0.5908 - classification_loss: 0.0877 43/500 [=>............................] - ETA: 2:30 - loss: 0.6708 - regression_loss: 0.5842 - classification_loss: 0.0865 44/500 [=>............................] - ETA: 2:29 - loss: 0.6749 - regression_loss: 0.5880 - classification_loss: 0.0869 45/500 [=>............................] - ETA: 2:29 - loss: 0.6755 - regression_loss: 0.5888 - classification_loss: 0.0867 46/500 [=>............................] - ETA: 2:29 - loss: 0.6698 - regression_loss: 0.5837 - classification_loss: 0.0861 47/500 [=>............................] - ETA: 2:28 - loss: 0.6827 - regression_loss: 0.5944 - classification_loss: 0.0883 48/500 [=>............................] - ETA: 2:28 - loss: 0.6823 - regression_loss: 0.5942 - classification_loss: 0.0881 49/500 [=>............................] - ETA: 2:28 - loss: 0.6747 - regression_loss: 0.5875 - classification_loss: 0.0872 50/500 [==>...........................] - ETA: 2:28 - loss: 0.6718 - regression_loss: 0.5851 - classification_loss: 0.0867 51/500 [==>...........................] - ETA: 2:27 - loss: 0.6756 - regression_loss: 0.5883 - classification_loss: 0.0873 52/500 [==>...........................] - ETA: 2:27 - loss: 0.6800 - regression_loss: 0.5921 - classification_loss: 0.0879 53/500 [==>...........................] - ETA: 2:26 - loss: 0.6878 - regression_loss: 0.5982 - classification_loss: 0.0896 54/500 [==>...........................] - ETA: 2:26 - loss: 0.6838 - regression_loss: 0.5946 - classification_loss: 0.0892 55/500 [==>...........................] - ETA: 2:26 - loss: 0.6872 - regression_loss: 0.5979 - classification_loss: 0.0893 56/500 [==>...........................] - ETA: 2:25 - loss: 0.6826 - regression_loss: 0.5936 - classification_loss: 0.0890 57/500 [==>...........................] 
- ETA: 2:25 - loss: 0.6766 - regression_loss: 0.5887 - classification_loss: 0.0879 58/500 [==>...........................] - ETA: 2:25 - loss: 0.6863 - regression_loss: 0.5969 - classification_loss: 0.0893 59/500 [==>...........................] - ETA: 2:24 - loss: 0.6968 - regression_loss: 0.6062 - classification_loss: 0.0906 60/500 [==>...........................] - ETA: 2:24 - loss: 0.6967 - regression_loss: 0.6061 - classification_loss: 0.0906 61/500 [==>...........................] - ETA: 2:24 - loss: 0.6967 - regression_loss: 0.6058 - classification_loss: 0.0909 62/500 [==>...........................] - ETA: 2:24 - loss: 0.6969 - regression_loss: 0.6060 - classification_loss: 0.0909 63/500 [==>...........................] - ETA: 2:23 - loss: 0.6946 - regression_loss: 0.6041 - classification_loss: 0.0905 64/500 [==>...........................] - ETA: 2:23 - loss: 0.6937 - regression_loss: 0.6034 - classification_loss: 0.0903 65/500 [==>...........................] - ETA: 2:23 - loss: 0.6961 - regression_loss: 0.6056 - classification_loss: 0.0905 66/500 [==>...........................] - ETA: 2:22 - loss: 0.6948 - regression_loss: 0.6042 - classification_loss: 0.0905 67/500 [===>..........................] - ETA: 2:22 - loss: 0.6955 - regression_loss: 0.6046 - classification_loss: 0.0909 68/500 [===>..........................] - ETA: 2:21 - loss: 0.6924 - regression_loss: 0.6021 - classification_loss: 0.0903 69/500 [===>..........................] - ETA: 2:21 - loss: 0.6886 - regression_loss: 0.5989 - classification_loss: 0.0896 70/500 [===>..........................] - ETA: 2:21 - loss: 0.6872 - regression_loss: 0.5979 - classification_loss: 0.0893 71/500 [===>..........................] - ETA: 2:20 - loss: 0.6818 - regression_loss: 0.5932 - classification_loss: 0.0886 72/500 [===>..........................] - ETA: 2:20 - loss: 0.6810 - regression_loss: 0.5929 - classification_loss: 0.0882 73/500 [===>..........................] 
- ETA: 2:20 - loss: 0.6827 - regression_loss: 0.5945 - classification_loss: 0.0882 74/500 [===>..........................] - ETA: 2:19 - loss: 0.6909 - regression_loss: 0.6012 - classification_loss: 0.0897 75/500 [===>..........................] - ETA: 2:19 - loss: 0.6924 - regression_loss: 0.6025 - classification_loss: 0.0899 76/500 [===>..........................] - ETA: 2:19 - loss: 0.6907 - regression_loss: 0.6014 - classification_loss: 0.0893 77/500 [===>..........................] - ETA: 2:19 - loss: 0.6849 - regression_loss: 0.5964 - classification_loss: 0.0885 78/500 [===>..........................] - ETA: 2:18 - loss: 0.6861 - regression_loss: 0.5975 - classification_loss: 0.0886 79/500 [===>..........................] - ETA: 2:18 - loss: 0.6888 - regression_loss: 0.5997 - classification_loss: 0.0891 80/500 [===>..........................] - ETA: 2:18 - loss: 0.6893 - regression_loss: 0.6005 - classification_loss: 0.0888 81/500 [===>..........................] - ETA: 2:17 - loss: 0.6867 - regression_loss: 0.5983 - classification_loss: 0.0884 82/500 [===>..........................] - ETA: 2:17 - loss: 0.6841 - regression_loss: 0.5963 - classification_loss: 0.0878 83/500 [===>..........................] - ETA: 2:17 - loss: 0.6883 - regression_loss: 0.6001 - classification_loss: 0.0882 84/500 [====>.........................] - ETA: 2:16 - loss: 0.6949 - regression_loss: 0.6060 - classification_loss: 0.0889 85/500 [====>.........................] - ETA: 2:16 - loss: 0.6974 - regression_loss: 0.6079 - classification_loss: 0.0895 86/500 [====>.........................] - ETA: 2:16 - loss: 0.6981 - regression_loss: 0.6088 - classification_loss: 0.0893 87/500 [====>.........................] - ETA: 2:16 - loss: 0.6995 - regression_loss: 0.6104 - classification_loss: 0.0891 88/500 [====>.........................] - ETA: 2:15 - loss: 0.6958 - regression_loss: 0.6071 - classification_loss: 0.0887 89/500 [====>.........................] 
- ETA: 2:15 - loss: 0.6929 - regression_loss: 0.6047 - classification_loss: 0.0882 90/500 [====>.........................] - ETA: 2:15 - loss: 0.6938 - regression_loss: 0.6054 - classification_loss: 0.0883 91/500 [====>.........................] - ETA: 2:14 - loss: 0.6934 - regression_loss: 0.6052 - classification_loss: 0.0882 92/500 [====>.........................] - ETA: 2:14 - loss: 0.6983 - regression_loss: 0.6097 - classification_loss: 0.0886 93/500 [====>.........................] - ETA: 2:14 - loss: 0.7003 - regression_loss: 0.6114 - classification_loss: 0.0889 94/500 [====>.........................] - ETA: 2:13 - loss: 0.6973 - regression_loss: 0.6090 - classification_loss: 0.0882 95/500 [====>.........................] - ETA: 2:13 - loss: 0.6957 - regression_loss: 0.6076 - classification_loss: 0.0881 96/500 [====>.........................] - ETA: 2:12 - loss: 0.6992 - regression_loss: 0.6103 - classification_loss: 0.0888 97/500 [====>.........................] - ETA: 2:12 - loss: 0.6993 - regression_loss: 0.6104 - classification_loss: 0.0889 98/500 [====>.........................] - ETA: 2:12 - loss: 0.7055 - regression_loss: 0.6157 - classification_loss: 0.0898 99/500 [====>.........................] - ETA: 2:11 - loss: 0.7093 - regression_loss: 0.6188 - classification_loss: 0.0905 100/500 [=====>........................] - ETA: 2:11 - loss: 0.7130 - regression_loss: 0.6218 - classification_loss: 0.0912 101/500 [=====>........................] - ETA: 2:11 - loss: 0.7113 - regression_loss: 0.6205 - classification_loss: 0.0909 102/500 [=====>........................] - ETA: 2:10 - loss: 0.7098 - regression_loss: 0.6192 - classification_loss: 0.0906 103/500 [=====>........................] - ETA: 2:10 - loss: 0.7148 - regression_loss: 0.6237 - classification_loss: 0.0911 104/500 [=====>........................] - ETA: 2:10 - loss: 0.7158 - regression_loss: 0.6248 - classification_loss: 0.0910 105/500 [=====>........................] 
- ETA: 2:09 - loss: 0.7157 - regression_loss: 0.6247 - classification_loss: 0.0910 106/500 [=====>........................] - ETA: 2:09 - loss: 0.7144 - regression_loss: 0.6235 - classification_loss: 0.0910 107/500 [=====>........................] - ETA: 2:09 - loss: 0.7197 - regression_loss: 0.6280 - classification_loss: 0.0917 108/500 [=====>........................] - ETA: 2:08 - loss: 0.7159 - regression_loss: 0.6248 - classification_loss: 0.0911 109/500 [=====>........................] - ETA: 2:08 - loss: 0.7147 - regression_loss: 0.6237 - classification_loss: 0.0910 110/500 [=====>........................] - ETA: 2:08 - loss: 0.7165 - regression_loss: 0.6254 - classification_loss: 0.0911 111/500 [=====>........................] - ETA: 2:08 - loss: 0.7192 - regression_loss: 0.6280 - classification_loss: 0.0912 112/500 [=====>........................] - ETA: 2:07 - loss: 0.7160 - regression_loss: 0.6253 - classification_loss: 0.0906 113/500 [=====>........................] - ETA: 2:07 - loss: 0.7137 - regression_loss: 0.6234 - classification_loss: 0.0902 114/500 [=====>........................] - ETA: 2:07 - loss: 0.7151 - regression_loss: 0.6248 - classification_loss: 0.0903 115/500 [=====>........................] - ETA: 2:06 - loss: 0.7151 - regression_loss: 0.6252 - classification_loss: 0.0899 116/500 [=====>........................] - ETA: 2:06 - loss: 0.7154 - regression_loss: 0.6258 - classification_loss: 0.0896 117/500 [======>.......................] - ETA: 2:06 - loss: 0.7186 - regression_loss: 0.6283 - classification_loss: 0.0903 118/500 [======>.......................] - ETA: 2:05 - loss: 0.7180 - regression_loss: 0.6278 - classification_loss: 0.0902 119/500 [======>.......................] - ETA: 2:05 - loss: 0.7156 - regression_loss: 0.6257 - classification_loss: 0.0899 120/500 [======>.......................] - ETA: 2:05 - loss: 0.7139 - regression_loss: 0.6245 - classification_loss: 0.0894 121/500 [======>.......................] 
- ETA: 2:04 - loss: 0.7117 - regression_loss: 0.6226 - classification_loss: 0.0890 122/500 [======>.......................] - ETA: 2:04 - loss: 0.7102 - regression_loss: 0.6215 - classification_loss: 0.0887 123/500 [======>.......................] - ETA: 2:04 - loss: 0.7136 - regression_loss: 0.6245 - classification_loss: 0.0890 124/500 [======>.......................] - ETA: 2:03 - loss: 0.7141 - regression_loss: 0.6252 - classification_loss: 0.0889 125/500 [======>.......................] - ETA: 2:03 - loss: 0.7143 - regression_loss: 0.6251 - classification_loss: 0.0892 126/500 [======>.......................] - ETA: 2:03 - loss: 0.7163 - regression_loss: 0.6269 - classification_loss: 0.0894 127/500 [======>.......................] - ETA: 2:02 - loss: 0.7136 - regression_loss: 0.6245 - classification_loss: 0.0891 128/500 [======>.......................] - ETA: 2:02 - loss: 0.7120 - regression_loss: 0.6229 - classification_loss: 0.0891 129/500 [======>.......................] - ETA: 2:02 - loss: 0.7088 - regression_loss: 0.6201 - classification_loss: 0.0887 130/500 [======>.......................] - ETA: 2:01 - loss: 0.7128 - regression_loss: 0.6232 - classification_loss: 0.0896 131/500 [======>.......................] - ETA: 2:01 - loss: 0.7111 - regression_loss: 0.6218 - classification_loss: 0.0893 132/500 [======>.......................] - ETA: 2:01 - loss: 0.7140 - regression_loss: 0.6243 - classification_loss: 0.0897 133/500 [======>.......................] - ETA: 2:00 - loss: 0.7154 - regression_loss: 0.6255 - classification_loss: 0.0899 134/500 [=======>......................] - ETA: 2:00 - loss: 0.7128 - regression_loss: 0.6234 - classification_loss: 0.0894 135/500 [=======>......................] - ETA: 2:00 - loss: 0.7100 - regression_loss: 0.6209 - classification_loss: 0.0891 136/500 [=======>......................] - ETA: 1:59 - loss: 0.7067 - regression_loss: 0.6181 - classification_loss: 0.0886 137/500 [=======>......................] 
138/500 [=======>......................] - ETA: 1:58 - loss: 0.7068 - regression_loss: 0.6180 - classification_loss: 0.0887
150/500 [========>.....................] - ETA: 1:55 - loss: 0.6977 - regression_loss: 0.6094 - classification_loss: 0.0883
200/500 [===========>..................] - ETA: 1:38 - loss: 0.7040 - regression_loss: 0.6155 - classification_loss: 0.0885
250/500 [==============>...............] - ETA: 1:22 - loss: 0.7069 - regression_loss: 0.6184 - classification_loss: 0.0885
300/500 [=================>............] - ETA: 1:06 - loss: 0.7126 - regression_loss: 0.6221 - classification_loss: 0.0905
350/500 [====================>.........] - ETA: 49s - loss: 0.7051 - regression_loss: 0.6163 - classification_loss: 0.0888
400/500 [=======================>......] - ETA: 33s - loss: 0.7130 - regression_loss: 0.6226 - classification_loss: 0.0903
450/500 [==========================>...] - ETA: 16s - loss: 0.7104 - regression_loss: 0.6207 - classification_loss: 0.0897
472/500 [===========================>..] - ETA: 9s - loss: 0.7133 - regression_loss: 0.6231 - classification_loss: 0.0902
- ETA: 8s - loss: 0.7126 - regression_loss: 0.6225 - classification_loss: 0.0901 474/500 [===========================>..] - ETA: 8s - loss: 0.7133 - regression_loss: 0.6232 - classification_loss: 0.0901 475/500 [===========================>..] - ETA: 8s - loss: 0.7123 - regression_loss: 0.6223 - classification_loss: 0.0900 476/500 [===========================>..] - ETA: 7s - loss: 0.7125 - regression_loss: 0.6225 - classification_loss: 0.0900 477/500 [===========================>..] - ETA: 7s - loss: 0.7117 - regression_loss: 0.6218 - classification_loss: 0.0899 478/500 [===========================>..] - ETA: 7s - loss: 0.7124 - regression_loss: 0.6225 - classification_loss: 0.0900 479/500 [===========================>..] - ETA: 6s - loss: 0.7118 - regression_loss: 0.6219 - classification_loss: 0.0899 480/500 [===========================>..] - ETA: 6s - loss: 0.7124 - regression_loss: 0.6223 - classification_loss: 0.0901 481/500 [===========================>..] - ETA: 6s - loss: 0.7123 - regression_loss: 0.6221 - classification_loss: 0.0902 482/500 [===========================>..] - ETA: 5s - loss: 0.7128 - regression_loss: 0.6227 - classification_loss: 0.0901 483/500 [===========================>..] - ETA: 5s - loss: 0.7136 - regression_loss: 0.6234 - classification_loss: 0.0902 484/500 [============================>.] - ETA: 5s - loss: 0.7131 - regression_loss: 0.6230 - classification_loss: 0.0901 485/500 [============================>.] - ETA: 4s - loss: 0.7130 - regression_loss: 0.6229 - classification_loss: 0.0901 486/500 [============================>.] - ETA: 4s - loss: 0.7143 - regression_loss: 0.6240 - classification_loss: 0.0903 487/500 [============================>.] - ETA: 4s - loss: 0.7138 - regression_loss: 0.6235 - classification_loss: 0.0903 488/500 [============================>.] - ETA: 3s - loss: 0.7142 - regression_loss: 0.6238 - classification_loss: 0.0904 489/500 [============================>.] 
- ETA: 3s - loss: 0.7153 - regression_loss: 0.6247 - classification_loss: 0.0906 490/500 [============================>.] - ETA: 3s - loss: 0.7149 - regression_loss: 0.6243 - classification_loss: 0.0905 491/500 [============================>.] - ETA: 2s - loss: 0.7143 - regression_loss: 0.6239 - classification_loss: 0.0905 492/500 [============================>.] - ETA: 2s - loss: 0.7139 - regression_loss: 0.6235 - classification_loss: 0.0904 493/500 [============================>.] - ETA: 2s - loss: 0.7136 - regression_loss: 0.6233 - classification_loss: 0.0903 494/500 [============================>.] - ETA: 1s - loss: 0.7129 - regression_loss: 0.6227 - classification_loss: 0.0902 495/500 [============================>.] - ETA: 1s - loss: 0.7132 - regression_loss: 0.6229 - classification_loss: 0.0903 496/500 [============================>.] - ETA: 1s - loss: 0.7128 - regression_loss: 0.6226 - classification_loss: 0.0902 497/500 [============================>.] - ETA: 0s - loss: 0.7143 - regression_loss: 0.6238 - classification_loss: 0.0905 498/500 [============================>.] - ETA: 0s - loss: 0.7145 - regression_loss: 0.6240 - classification_loss: 0.0904 499/500 [============================>.] - ETA: 0s - loss: 0.7147 - regression_loss: 0.6242 - classification_loss: 0.0905 500/500 [==============================] - 165s 330ms/step - loss: 0.7148 - regression_loss: 0.6243 - classification_loss: 0.0905 1172 instances of class plum with average precision: 0.6479 mAP: 0.6479 Epoch 00044: saving model to ./training/snapshots/resnet101_pascal_44.h5 Epoch 45/150 1/500 [..............................] - ETA: 2:35 - loss: 0.5887 - regression_loss: 0.5335 - classification_loss: 0.0552 2/500 [..............................] - ETA: 2:34 - loss: 0.6476 - regression_loss: 0.5636 - classification_loss: 0.0840 3/500 [..............................] - ETA: 2:40 - loss: 0.5999 - regression_loss: 0.5277 - classification_loss: 0.0721 4/500 [..............................] 
- ETA: 2:40 - loss: 0.6524 - regression_loss: 0.5746 - classification_loss: 0.0778 5/500 [..............................] - ETA: 2:39 - loss: 0.8008 - regression_loss: 0.6949 - classification_loss: 0.1059 6/500 [..............................] - ETA: 2:41 - loss: 0.7559 - regression_loss: 0.6581 - classification_loss: 0.0978 7/500 [..............................] - ETA: 2:42 - loss: 0.7578 - regression_loss: 0.6599 - classification_loss: 0.0979 8/500 [..............................] - ETA: 2:41 - loss: 0.7411 - regression_loss: 0.6462 - classification_loss: 0.0949 9/500 [..............................] - ETA: 2:40 - loss: 0.7086 - regression_loss: 0.6178 - classification_loss: 0.0908 10/500 [..............................] - ETA: 2:40 - loss: 0.7153 - regression_loss: 0.6250 - classification_loss: 0.0903 11/500 [..............................] - ETA: 2:40 - loss: 0.6812 - regression_loss: 0.5976 - classification_loss: 0.0837 12/500 [..............................] - ETA: 2:39 - loss: 0.7042 - regression_loss: 0.6210 - classification_loss: 0.0831 13/500 [..............................] - ETA: 2:40 - loss: 0.7092 - regression_loss: 0.6289 - classification_loss: 0.0803 14/500 [..............................] - ETA: 2:40 - loss: 0.6792 - regression_loss: 0.6019 - classification_loss: 0.0772 15/500 [..............................] - ETA: 2:40 - loss: 0.7176 - regression_loss: 0.6340 - classification_loss: 0.0835 16/500 [..............................] - ETA: 2:40 - loss: 0.7050 - regression_loss: 0.6240 - classification_loss: 0.0811 17/500 [>.............................] - ETA: 2:39 - loss: 0.7339 - regression_loss: 0.6489 - classification_loss: 0.0849 18/500 [>.............................] - ETA: 2:39 - loss: 0.7483 - regression_loss: 0.6610 - classification_loss: 0.0873 19/500 [>.............................] - ETA: 2:40 - loss: 0.7378 - regression_loss: 0.6520 - classification_loss: 0.0858 20/500 [>.............................] 
- ETA: 2:39 - loss: 0.7601 - regression_loss: 0.6695 - classification_loss: 0.0906 21/500 [>.............................] - ETA: 2:39 - loss: 0.7753 - regression_loss: 0.6832 - classification_loss: 0.0921 22/500 [>.............................] - ETA: 2:38 - loss: 0.7759 - regression_loss: 0.6839 - classification_loss: 0.0919 23/500 [>.............................] - ETA: 2:38 - loss: 0.7786 - regression_loss: 0.6877 - classification_loss: 0.0910 24/500 [>.............................] - ETA: 2:38 - loss: 0.7620 - regression_loss: 0.6733 - classification_loss: 0.0887 25/500 [>.............................] - ETA: 2:37 - loss: 0.7587 - regression_loss: 0.6711 - classification_loss: 0.0876 26/500 [>.............................] - ETA: 2:37 - loss: 0.7461 - regression_loss: 0.6598 - classification_loss: 0.0863 27/500 [>.............................] - ETA: 2:37 - loss: 0.7331 - regression_loss: 0.6487 - classification_loss: 0.0844 28/500 [>.............................] - ETA: 2:36 - loss: 0.7379 - regression_loss: 0.6515 - classification_loss: 0.0864 29/500 [>.............................] - ETA: 2:36 - loss: 0.7262 - regression_loss: 0.6419 - classification_loss: 0.0842 30/500 [>.............................] - ETA: 2:35 - loss: 0.7182 - regression_loss: 0.6348 - classification_loss: 0.0834 31/500 [>.............................] - ETA: 2:35 - loss: 0.7154 - regression_loss: 0.6323 - classification_loss: 0.0831 32/500 [>.............................] - ETA: 2:36 - loss: 0.7153 - regression_loss: 0.6317 - classification_loss: 0.0835 33/500 [>.............................] - ETA: 2:35 - loss: 0.7223 - regression_loss: 0.6371 - classification_loss: 0.0851 34/500 [=>............................] - ETA: 2:35 - loss: 0.7115 - regression_loss: 0.6277 - classification_loss: 0.0838 35/500 [=>............................] - ETA: 2:34 - loss: 0.7196 - regression_loss: 0.6338 - classification_loss: 0.0858 36/500 [=>............................] 
- ETA: 2:34 - loss: 0.7190 - regression_loss: 0.6324 - classification_loss: 0.0866 37/500 [=>............................] - ETA: 2:34 - loss: 0.7216 - regression_loss: 0.6343 - classification_loss: 0.0873 38/500 [=>............................] - ETA: 2:34 - loss: 0.7194 - regression_loss: 0.6318 - classification_loss: 0.0876 39/500 [=>............................] - ETA: 2:33 - loss: 0.7189 - regression_loss: 0.6309 - classification_loss: 0.0879 40/500 [=>............................] - ETA: 2:33 - loss: 0.7269 - regression_loss: 0.6369 - classification_loss: 0.0900 41/500 [=>............................] - ETA: 2:33 - loss: 0.7263 - regression_loss: 0.6366 - classification_loss: 0.0897 42/500 [=>............................] - ETA: 2:32 - loss: 0.7280 - regression_loss: 0.6388 - classification_loss: 0.0892 43/500 [=>............................] - ETA: 2:32 - loss: 0.7329 - regression_loss: 0.6426 - classification_loss: 0.0903 44/500 [=>............................] - ETA: 2:32 - loss: 0.7459 - regression_loss: 0.6532 - classification_loss: 0.0928 45/500 [=>............................] - ETA: 2:31 - loss: 0.7451 - regression_loss: 0.6524 - classification_loss: 0.0928 46/500 [=>............................] - ETA: 2:31 - loss: 0.7468 - regression_loss: 0.6537 - classification_loss: 0.0930 47/500 [=>............................] - ETA: 2:31 - loss: 0.7396 - regression_loss: 0.6472 - classification_loss: 0.0924 48/500 [=>............................] - ETA: 2:30 - loss: 0.7422 - regression_loss: 0.6507 - classification_loss: 0.0915 49/500 [=>............................] - ETA: 2:30 - loss: 0.7415 - regression_loss: 0.6504 - classification_loss: 0.0910 50/500 [==>...........................] - ETA: 2:30 - loss: 0.7340 - regression_loss: 0.6438 - classification_loss: 0.0901 51/500 [==>...........................] - ETA: 2:29 - loss: 0.7391 - regression_loss: 0.6486 - classification_loss: 0.0905 52/500 [==>...........................] 
- ETA: 2:29 - loss: 0.7419 - regression_loss: 0.6509 - classification_loss: 0.0910 53/500 [==>...........................] - ETA: 2:29 - loss: 0.7375 - regression_loss: 0.6468 - classification_loss: 0.0908 54/500 [==>...........................] - ETA: 2:29 - loss: 0.7316 - regression_loss: 0.6417 - classification_loss: 0.0899 55/500 [==>...........................] - ETA: 2:28 - loss: 0.7279 - regression_loss: 0.6385 - classification_loss: 0.0893 56/500 [==>...........................] - ETA: 2:28 - loss: 0.7317 - regression_loss: 0.6421 - classification_loss: 0.0896 57/500 [==>...........................] - ETA: 2:28 - loss: 0.7290 - regression_loss: 0.6399 - classification_loss: 0.0890 58/500 [==>...........................] - ETA: 2:27 - loss: 0.7249 - regression_loss: 0.6365 - classification_loss: 0.0884 59/500 [==>...........................] - ETA: 2:27 - loss: 0.7297 - regression_loss: 0.6411 - classification_loss: 0.0886 60/500 [==>...........................] - ETA: 2:27 - loss: 0.7252 - regression_loss: 0.6371 - classification_loss: 0.0882 61/500 [==>...........................] - ETA: 2:26 - loss: 0.7192 - regression_loss: 0.6318 - classification_loss: 0.0874 62/500 [==>...........................] - ETA: 2:26 - loss: 0.7185 - regression_loss: 0.6309 - classification_loss: 0.0876 63/500 [==>...........................] - ETA: 2:25 - loss: 0.7129 - regression_loss: 0.6259 - classification_loss: 0.0869 64/500 [==>...........................] - ETA: 2:25 - loss: 0.7066 - regression_loss: 0.6204 - classification_loss: 0.0862 65/500 [==>...........................] - ETA: 2:25 - loss: 0.7027 - regression_loss: 0.6167 - classification_loss: 0.0860 66/500 [==>...........................] - ETA: 2:25 - loss: 0.7031 - regression_loss: 0.6170 - classification_loss: 0.0861 67/500 [===>..........................] - ETA: 2:24 - loss: 0.6962 - regression_loss: 0.6107 - classification_loss: 0.0855 68/500 [===>..........................] 
- ETA: 2:24 - loss: 0.7001 - regression_loss: 0.6143 - classification_loss: 0.0859 69/500 [===>..........................] - ETA: 2:23 - loss: 0.6938 - regression_loss: 0.6089 - classification_loss: 0.0849 70/500 [===>..........................] - ETA: 2:23 - loss: 0.7057 - regression_loss: 0.6186 - classification_loss: 0.0870 71/500 [===>..........................] - ETA: 2:23 - loss: 0.7058 - regression_loss: 0.6189 - classification_loss: 0.0869 72/500 [===>..........................] - ETA: 2:22 - loss: 0.7051 - regression_loss: 0.6185 - classification_loss: 0.0866 73/500 [===>..........................] - ETA: 2:22 - loss: 0.7088 - regression_loss: 0.6215 - classification_loss: 0.0872 74/500 [===>..........................] - ETA: 2:21 - loss: 0.7092 - regression_loss: 0.6218 - classification_loss: 0.0873 75/500 [===>..........................] - ETA: 2:21 - loss: 0.7047 - regression_loss: 0.6180 - classification_loss: 0.0868 76/500 [===>..........................] - ETA: 2:21 - loss: 0.7025 - regression_loss: 0.6157 - classification_loss: 0.0869 77/500 [===>..........................] - ETA: 2:20 - loss: 0.7022 - regression_loss: 0.6158 - classification_loss: 0.0864 78/500 [===>..........................] - ETA: 2:20 - loss: 0.6998 - regression_loss: 0.6139 - classification_loss: 0.0860 79/500 [===>..........................] - ETA: 2:19 - loss: 0.7032 - regression_loss: 0.6173 - classification_loss: 0.0859 80/500 [===>..........................] - ETA: 2:19 - loss: 0.7004 - regression_loss: 0.6150 - classification_loss: 0.0853 81/500 [===>..........................] - ETA: 2:19 - loss: 0.7049 - regression_loss: 0.6187 - classification_loss: 0.0862 82/500 [===>..........................] - ETA: 2:18 - loss: 0.7078 - regression_loss: 0.6208 - classification_loss: 0.0869 83/500 [===>..........................] - ETA: 2:18 - loss: 0.7026 - regression_loss: 0.6164 - classification_loss: 0.0863 84/500 [====>.........................] 
- ETA: 2:18 - loss: 0.6992 - regression_loss: 0.6133 - classification_loss: 0.0859 85/500 [====>.........................] - ETA: 2:17 - loss: 0.6965 - regression_loss: 0.6111 - classification_loss: 0.0855 86/500 [====>.........................] - ETA: 2:17 - loss: 0.7001 - regression_loss: 0.6142 - classification_loss: 0.0859 87/500 [====>.........................] - ETA: 2:17 - loss: 0.7010 - regression_loss: 0.6149 - classification_loss: 0.0860 88/500 [====>.........................] - ETA: 2:17 - loss: 0.6978 - regression_loss: 0.6121 - classification_loss: 0.0857 89/500 [====>.........................] - ETA: 2:16 - loss: 0.6989 - regression_loss: 0.6130 - classification_loss: 0.0858 90/500 [====>.........................] - ETA: 2:16 - loss: 0.6977 - regression_loss: 0.6123 - classification_loss: 0.0854 91/500 [====>.........................] - ETA: 2:16 - loss: 0.6953 - regression_loss: 0.6101 - classification_loss: 0.0852 92/500 [====>.........................] - ETA: 2:15 - loss: 0.7037 - regression_loss: 0.6169 - classification_loss: 0.0867 93/500 [====>.........................] - ETA: 2:15 - loss: 0.7062 - regression_loss: 0.6191 - classification_loss: 0.0871 94/500 [====>.........................] - ETA: 2:15 - loss: 0.7063 - regression_loss: 0.6190 - classification_loss: 0.0873 95/500 [====>.........................] - ETA: 2:14 - loss: 0.7027 - regression_loss: 0.6159 - classification_loss: 0.0868 96/500 [====>.........................] - ETA: 2:14 - loss: 0.7037 - regression_loss: 0.6166 - classification_loss: 0.0871 97/500 [====>.........................] - ETA: 2:14 - loss: 0.7010 - regression_loss: 0.6142 - classification_loss: 0.0868 98/500 [====>.........................] - ETA: 2:13 - loss: 0.7013 - regression_loss: 0.6145 - classification_loss: 0.0868 99/500 [====>.........................] - ETA: 2:13 - loss: 0.7018 - regression_loss: 0.6150 - classification_loss: 0.0868 100/500 [=====>........................] 
- ETA: 2:13 - loss: 0.7029 - regression_loss: 0.6156 - classification_loss: 0.0873 101/500 [=====>........................] - ETA: 2:13 - loss: 0.7012 - regression_loss: 0.6140 - classification_loss: 0.0872 102/500 [=====>........................] - ETA: 2:12 - loss: 0.7009 - regression_loss: 0.6137 - classification_loss: 0.0871 103/500 [=====>........................] - ETA: 2:12 - loss: 0.7020 - regression_loss: 0.6147 - classification_loss: 0.0872 104/500 [=====>........................] - ETA: 2:12 - loss: 0.7005 - regression_loss: 0.6134 - classification_loss: 0.0871 105/500 [=====>........................] - ETA: 2:11 - loss: 0.6988 - regression_loss: 0.6121 - classification_loss: 0.0867 106/500 [=====>........................] - ETA: 2:11 - loss: 0.6986 - regression_loss: 0.6121 - classification_loss: 0.0864 107/500 [=====>........................] - ETA: 2:10 - loss: 0.7029 - regression_loss: 0.6157 - classification_loss: 0.0872 108/500 [=====>........................] - ETA: 2:10 - loss: 0.7047 - regression_loss: 0.6173 - classification_loss: 0.0874 109/500 [=====>........................] - ETA: 2:10 - loss: 0.7083 - regression_loss: 0.6202 - classification_loss: 0.0882 110/500 [=====>........................] - ETA: 2:09 - loss: 0.7055 - regression_loss: 0.6179 - classification_loss: 0.0876 111/500 [=====>........................] - ETA: 2:09 - loss: 0.7055 - regression_loss: 0.6179 - classification_loss: 0.0876 112/500 [=====>........................] - ETA: 2:09 - loss: 0.7061 - regression_loss: 0.6187 - classification_loss: 0.0874 113/500 [=====>........................] - ETA: 2:08 - loss: 0.7044 - regression_loss: 0.6174 - classification_loss: 0.0870 114/500 [=====>........................] - ETA: 2:08 - loss: 0.7072 - regression_loss: 0.6193 - classification_loss: 0.0879 115/500 [=====>........................] - ETA: 2:08 - loss: 0.7051 - regression_loss: 0.6177 - classification_loss: 0.0874 116/500 [=====>........................] 
- ETA: 2:07 - loss: 0.7040 - regression_loss: 0.6168 - classification_loss: 0.0872 117/500 [======>.......................] - ETA: 2:07 - loss: 0.7006 - regression_loss: 0.6139 - classification_loss: 0.0867 118/500 [======>.......................] - ETA: 2:07 - loss: 0.6970 - regression_loss: 0.6107 - classification_loss: 0.0863 119/500 [======>.......................] - ETA: 2:06 - loss: 0.6931 - regression_loss: 0.6073 - classification_loss: 0.0858 120/500 [======>.......................] - ETA: 2:06 - loss: 0.6918 - regression_loss: 0.6062 - classification_loss: 0.0855 121/500 [======>.......................] - ETA: 2:06 - loss: 0.6923 - regression_loss: 0.6066 - classification_loss: 0.0857 122/500 [======>.......................] - ETA: 2:05 - loss: 0.6924 - regression_loss: 0.6065 - classification_loss: 0.0859 123/500 [======>.......................] - ETA: 2:05 - loss: 0.6924 - regression_loss: 0.6065 - classification_loss: 0.0859 124/500 [======>.......................] - ETA: 2:05 - loss: 0.6961 - regression_loss: 0.6095 - classification_loss: 0.0866 125/500 [======>.......................] - ETA: 2:04 - loss: 0.6992 - regression_loss: 0.6124 - classification_loss: 0.0868 126/500 [======>.......................] - ETA: 2:04 - loss: 0.7009 - regression_loss: 0.6136 - classification_loss: 0.0874 127/500 [======>.......................] - ETA: 2:04 - loss: 0.6980 - regression_loss: 0.6111 - classification_loss: 0.0870 128/500 [======>.......................] - ETA: 2:03 - loss: 0.6996 - regression_loss: 0.6124 - classification_loss: 0.0872 129/500 [======>.......................] - ETA: 2:03 - loss: 0.6971 - regression_loss: 0.6103 - classification_loss: 0.0867 130/500 [======>.......................] - ETA: 2:03 - loss: 0.6959 - regression_loss: 0.6091 - classification_loss: 0.0867 131/500 [======>.......................] - ETA: 2:02 - loss: 0.6930 - regression_loss: 0.6067 - classification_loss: 0.0863 132/500 [======>.......................] 
- ETA: 2:02 - loss: 0.6959 - regression_loss: 0.6090 - classification_loss: 0.0869 133/500 [======>.......................] - ETA: 2:02 - loss: 0.6965 - regression_loss: 0.6091 - classification_loss: 0.0873 134/500 [=======>......................] - ETA: 2:01 - loss: 0.6941 - regression_loss: 0.6071 - classification_loss: 0.0870 135/500 [=======>......................] - ETA: 2:01 - loss: 0.6934 - regression_loss: 0.6067 - classification_loss: 0.0868 136/500 [=======>......................] - ETA: 2:01 - loss: 0.6948 - regression_loss: 0.6077 - classification_loss: 0.0871 137/500 [=======>......................] - ETA: 2:00 - loss: 0.6954 - regression_loss: 0.6083 - classification_loss: 0.0871 138/500 [=======>......................] - ETA: 2:00 - loss: 0.6962 - regression_loss: 0.6090 - classification_loss: 0.0872 139/500 [=======>......................] - ETA: 2:00 - loss: 0.6987 - regression_loss: 0.6112 - classification_loss: 0.0875 140/500 [=======>......................] - ETA: 1:59 - loss: 0.6986 - regression_loss: 0.6110 - classification_loss: 0.0876 141/500 [=======>......................] - ETA: 1:59 - loss: 0.6986 - regression_loss: 0.6110 - classification_loss: 0.0875 142/500 [=======>......................] - ETA: 1:59 - loss: 0.7023 - regression_loss: 0.6141 - classification_loss: 0.0882 143/500 [=======>......................] - ETA: 1:58 - loss: 0.7051 - regression_loss: 0.6168 - classification_loss: 0.0883 144/500 [=======>......................] - ETA: 1:58 - loss: 0.7056 - regression_loss: 0.6172 - classification_loss: 0.0884 145/500 [=======>......................] - ETA: 1:58 - loss: 0.7023 - regression_loss: 0.6144 - classification_loss: 0.0879 146/500 [=======>......................] - ETA: 1:57 - loss: 0.7015 - regression_loss: 0.6138 - classification_loss: 0.0876 147/500 [=======>......................] - ETA: 1:57 - loss: 0.7006 - regression_loss: 0.6131 - classification_loss: 0.0875 148/500 [=======>......................] 
- ETA: 1:57 - loss: 0.7023 - regression_loss: 0.6146 - classification_loss: 0.0877 149/500 [=======>......................] - ETA: 1:56 - loss: 0.7033 - regression_loss: 0.6152 - classification_loss: 0.0880 150/500 [========>.....................] - ETA: 1:56 - loss: 0.7036 - regression_loss: 0.6157 - classification_loss: 0.0878 151/500 [========>.....................] - ETA: 1:56 - loss: 0.7034 - regression_loss: 0.6156 - classification_loss: 0.0878 152/500 [========>.....................] - ETA: 1:55 - loss: 0.7063 - regression_loss: 0.6180 - classification_loss: 0.0882 153/500 [========>.....................] - ETA: 1:55 - loss: 0.7037 - regression_loss: 0.6158 - classification_loss: 0.0879 154/500 [========>.....................] - ETA: 1:55 - loss: 0.7034 - regression_loss: 0.6155 - classification_loss: 0.0879 155/500 [========>.....................] - ETA: 1:54 - loss: 0.7048 - regression_loss: 0.6168 - classification_loss: 0.0880 156/500 [========>.....................] - ETA: 1:54 - loss: 0.7025 - regression_loss: 0.6148 - classification_loss: 0.0877 157/500 [========>.....................] - ETA: 1:54 - loss: 0.7080 - regression_loss: 0.6193 - classification_loss: 0.0886 158/500 [========>.....................] - ETA: 1:53 - loss: 0.7054 - regression_loss: 0.6169 - classification_loss: 0.0885 159/500 [========>.....................] - ETA: 1:53 - loss: 0.7063 - regression_loss: 0.6176 - classification_loss: 0.0887 160/500 [========>.....................] - ETA: 1:53 - loss: 0.7060 - regression_loss: 0.6172 - classification_loss: 0.0889 161/500 [========>.....................] - ETA: 1:52 - loss: 0.7061 - regression_loss: 0.6172 - classification_loss: 0.0888 162/500 [========>.....................] - ETA: 1:52 - loss: 0.7032 - regression_loss: 0.6147 - classification_loss: 0.0885 163/500 [========>.....................] - ETA: 1:52 - loss: 0.7015 - regression_loss: 0.6132 - classification_loss: 0.0883 164/500 [========>.....................] 
- ETA: 1:51 - loss: 0.7037 - regression_loss: 0.6150 - classification_loss: 0.0887 165/500 [========>.....................] - ETA: 1:51 - loss: 0.7044 - regression_loss: 0.6157 - classification_loss: 0.0887 166/500 [========>.....................] - ETA: 1:51 - loss: 0.7066 - regression_loss: 0.6174 - classification_loss: 0.0892 167/500 [=========>....................] - ETA: 1:50 - loss: 0.7093 - regression_loss: 0.6201 - classification_loss: 0.0892 168/500 [=========>....................] - ETA: 1:50 - loss: 0.7073 - regression_loss: 0.6184 - classification_loss: 0.0889 169/500 [=========>....................] - ETA: 1:50 - loss: 0.7057 - regression_loss: 0.6171 - classification_loss: 0.0886 170/500 [=========>....................] - ETA: 1:49 - loss: 0.7036 - regression_loss: 0.6152 - classification_loss: 0.0884 171/500 [=========>....................] - ETA: 1:49 - loss: 0.7023 - regression_loss: 0.6142 - classification_loss: 0.0881 172/500 [=========>....................] - ETA: 1:49 - loss: 0.7028 - regression_loss: 0.6145 - classification_loss: 0.0883 173/500 [=========>....................] - ETA: 1:48 - loss: 0.7026 - regression_loss: 0.6143 - classification_loss: 0.0883 174/500 [=========>....................] - ETA: 1:48 - loss: 0.7043 - regression_loss: 0.6159 - classification_loss: 0.0884 175/500 [=========>....................] - ETA: 1:48 - loss: 0.7047 - regression_loss: 0.6163 - classification_loss: 0.0884 176/500 [=========>....................] - ETA: 1:47 - loss: 0.7066 - regression_loss: 0.6182 - classification_loss: 0.0884 177/500 [=========>....................] - ETA: 1:47 - loss: 0.7064 - regression_loss: 0.6180 - classification_loss: 0.0884 178/500 [=========>....................] - ETA: 1:47 - loss: 0.7100 - regression_loss: 0.6215 - classification_loss: 0.0885 179/500 [=========>....................] - ETA: 1:46 - loss: 0.7101 - regression_loss: 0.6216 - classification_loss: 0.0885 180/500 [=========>....................] 
[per-batch progress output for steps 181–499 of epoch 45 elided; loss fluctuated in roughly the 0.70–0.74 range as the epoch progressed]
500/500 [==============================] - 166s 332ms/step - loss: 0.7006 - regression_loss: 0.6133 - classification_loss: 0.0873
1172 instances of class plum with average precision: 0.6464
mAP: 0.6464
Epoch 00045: saving model to ./training/snapshots/resnet101_pascal_45.h5
Epoch 46/150
[per-batch progress output for steps 1–13 of epoch 46 elided; loss settled near 0.56 after an initial 0.80]
- ETA: 2:39 - loss: 0.5507 - regression_loss: 0.4894 - classification_loss: 0.0613 15/500 [..............................] - ETA: 2:38 - loss: 0.5784 - regression_loss: 0.5143 - classification_loss: 0.0641 16/500 [..............................] - ETA: 2:37 - loss: 0.5678 - regression_loss: 0.5051 - classification_loss: 0.0627 17/500 [>.............................] - ETA: 2:37 - loss: 0.5617 - regression_loss: 0.5000 - classification_loss: 0.0617 18/500 [>.............................] - ETA: 2:37 - loss: 0.5747 - regression_loss: 0.5099 - classification_loss: 0.0648 19/500 [>.............................] - ETA: 2:37 - loss: 0.5901 - regression_loss: 0.5231 - classification_loss: 0.0670 20/500 [>.............................] - ETA: 2:36 - loss: 0.5998 - regression_loss: 0.5298 - classification_loss: 0.0700 21/500 [>.............................] - ETA: 2:37 - loss: 0.5839 - regression_loss: 0.5156 - classification_loss: 0.0683 22/500 [>.............................] - ETA: 2:36 - loss: 0.5724 - regression_loss: 0.5059 - classification_loss: 0.0665 23/500 [>.............................] - ETA: 2:37 - loss: 0.5643 - regression_loss: 0.4988 - classification_loss: 0.0655 24/500 [>.............................] - ETA: 2:36 - loss: 0.5818 - regression_loss: 0.5131 - classification_loss: 0.0687 25/500 [>.............................] - ETA: 2:36 - loss: 0.6001 - regression_loss: 0.5296 - classification_loss: 0.0705 26/500 [>.............................] - ETA: 2:36 - loss: 0.6258 - regression_loss: 0.5504 - classification_loss: 0.0754 27/500 [>.............................] - ETA: 2:36 - loss: 0.6280 - regression_loss: 0.5515 - classification_loss: 0.0765 28/500 [>.............................] - ETA: 2:35 - loss: 0.6256 - regression_loss: 0.5494 - classification_loss: 0.0763 29/500 [>.............................] - ETA: 2:35 - loss: 0.6303 - regression_loss: 0.5525 - classification_loss: 0.0779 30/500 [>.............................] 
- ETA: 2:34 - loss: 0.6293 - regression_loss: 0.5520 - classification_loss: 0.0772 31/500 [>.............................] - ETA: 2:34 - loss: 0.6261 - regression_loss: 0.5505 - classification_loss: 0.0757 32/500 [>.............................] - ETA: 2:33 - loss: 0.6278 - regression_loss: 0.5519 - classification_loss: 0.0758 33/500 [>.............................] - ETA: 2:33 - loss: 0.6227 - regression_loss: 0.5477 - classification_loss: 0.0750 34/500 [=>............................] - ETA: 2:33 - loss: 0.6237 - regression_loss: 0.5491 - classification_loss: 0.0745 35/500 [=>............................] - ETA: 2:32 - loss: 0.6371 - regression_loss: 0.5603 - classification_loss: 0.0767 36/500 [=>............................] - ETA: 2:32 - loss: 0.6330 - regression_loss: 0.5567 - classification_loss: 0.0763 37/500 [=>............................] - ETA: 2:32 - loss: 0.6399 - regression_loss: 0.5619 - classification_loss: 0.0780 38/500 [=>............................] - ETA: 2:32 - loss: 0.6476 - regression_loss: 0.5686 - classification_loss: 0.0790 39/500 [=>............................] - ETA: 2:32 - loss: 0.6454 - regression_loss: 0.5662 - classification_loss: 0.0793 40/500 [=>............................] - ETA: 2:32 - loss: 0.6363 - regression_loss: 0.5581 - classification_loss: 0.0782 41/500 [=>............................] - ETA: 2:32 - loss: 0.6326 - regression_loss: 0.5544 - classification_loss: 0.0782 42/500 [=>............................] - ETA: 2:31 - loss: 0.6383 - regression_loss: 0.5592 - classification_loss: 0.0791 43/500 [=>............................] - ETA: 2:31 - loss: 0.6293 - regression_loss: 0.5513 - classification_loss: 0.0780 44/500 [=>............................] - ETA: 2:31 - loss: 0.6227 - regression_loss: 0.5455 - classification_loss: 0.0772 45/500 [=>............................] - ETA: 2:30 - loss: 0.6290 - regression_loss: 0.5504 - classification_loss: 0.0787 46/500 [=>............................] 
- ETA: 2:29 - loss: 0.6318 - regression_loss: 0.5527 - classification_loss: 0.0790 47/500 [=>............................] - ETA: 2:29 - loss: 0.6400 - regression_loss: 0.5599 - classification_loss: 0.0801 48/500 [=>............................] - ETA: 2:29 - loss: 0.6370 - regression_loss: 0.5582 - classification_loss: 0.0789 49/500 [=>............................] - ETA: 2:28 - loss: 0.6436 - regression_loss: 0.5645 - classification_loss: 0.0791 50/500 [==>...........................] - ETA: 2:28 - loss: 0.6434 - regression_loss: 0.5641 - classification_loss: 0.0792 51/500 [==>...........................] - ETA: 2:27 - loss: 0.6478 - regression_loss: 0.5677 - classification_loss: 0.0801 52/500 [==>...........................] - ETA: 2:27 - loss: 0.6389 - regression_loss: 0.5592 - classification_loss: 0.0797 53/500 [==>...........................] - ETA: 2:27 - loss: 0.6413 - regression_loss: 0.5619 - classification_loss: 0.0794 54/500 [==>...........................] - ETA: 2:27 - loss: 0.6437 - regression_loss: 0.5644 - classification_loss: 0.0794 55/500 [==>...........................] - ETA: 2:26 - loss: 0.6485 - regression_loss: 0.5686 - classification_loss: 0.0799 56/500 [==>...........................] - ETA: 2:26 - loss: 0.6538 - regression_loss: 0.5728 - classification_loss: 0.0811 57/500 [==>...........................] - ETA: 2:26 - loss: 0.6592 - regression_loss: 0.5765 - classification_loss: 0.0828 58/500 [==>...........................] - ETA: 2:25 - loss: 0.6628 - regression_loss: 0.5801 - classification_loss: 0.0827 59/500 [==>...........................] - ETA: 2:25 - loss: 0.6636 - regression_loss: 0.5806 - classification_loss: 0.0830 60/500 [==>...........................] - ETA: 2:24 - loss: 0.6559 - regression_loss: 0.5738 - classification_loss: 0.0822 61/500 [==>...........................] - ETA: 2:24 - loss: 0.6623 - regression_loss: 0.5785 - classification_loss: 0.0838 62/500 [==>...........................] 
- ETA: 2:24 - loss: 0.6738 - regression_loss: 0.5881 - classification_loss: 0.0857 63/500 [==>...........................] - ETA: 2:24 - loss: 0.6695 - regression_loss: 0.5845 - classification_loss: 0.0850 64/500 [==>...........................] - ETA: 2:24 - loss: 0.6733 - regression_loss: 0.5875 - classification_loss: 0.0858 65/500 [==>...........................] - ETA: 2:23 - loss: 0.6692 - regression_loss: 0.5839 - classification_loss: 0.0853 66/500 [==>...........................] - ETA: 2:23 - loss: 0.6671 - regression_loss: 0.5812 - classification_loss: 0.0859 67/500 [===>..........................] - ETA: 2:23 - loss: 0.6660 - regression_loss: 0.5800 - classification_loss: 0.0860 68/500 [===>..........................] - ETA: 2:22 - loss: 0.6667 - regression_loss: 0.5806 - classification_loss: 0.0861 69/500 [===>..........................] - ETA: 2:22 - loss: 0.6609 - regression_loss: 0.5756 - classification_loss: 0.0853 70/500 [===>..........................] - ETA: 2:21 - loss: 0.6551 - regression_loss: 0.5708 - classification_loss: 0.0844 71/500 [===>..........................] - ETA: 2:21 - loss: 0.6534 - regression_loss: 0.5691 - classification_loss: 0.0843 72/500 [===>..........................] - ETA: 2:21 - loss: 0.6498 - regression_loss: 0.5662 - classification_loss: 0.0836 73/500 [===>..........................] - ETA: 2:20 - loss: 0.6480 - regression_loss: 0.5646 - classification_loss: 0.0833 74/500 [===>..........................] - ETA: 2:20 - loss: 0.6480 - regression_loss: 0.5648 - classification_loss: 0.0832 75/500 [===>..........................] - ETA: 2:20 - loss: 0.6466 - regression_loss: 0.5640 - classification_loss: 0.0826 76/500 [===>..........................] - ETA: 2:19 - loss: 0.6405 - regression_loss: 0.5588 - classification_loss: 0.0817 77/500 [===>..........................] - ETA: 2:19 - loss: 0.6508 - regression_loss: 0.5674 - classification_loss: 0.0833 78/500 [===>..........................] 
- ETA: 2:19 - loss: 0.6474 - regression_loss: 0.5647 - classification_loss: 0.0827 79/500 [===>..........................] - ETA: 2:18 - loss: 0.6458 - regression_loss: 0.5635 - classification_loss: 0.0824 80/500 [===>..........................] - ETA: 2:18 - loss: 0.6482 - regression_loss: 0.5654 - classification_loss: 0.0828 81/500 [===>..........................] - ETA: 2:18 - loss: 0.6475 - regression_loss: 0.5645 - classification_loss: 0.0830 82/500 [===>..........................] - ETA: 2:17 - loss: 0.6491 - regression_loss: 0.5659 - classification_loss: 0.0833 83/500 [===>..........................] - ETA: 2:17 - loss: 0.6443 - regression_loss: 0.5615 - classification_loss: 0.0828 84/500 [====>.........................] - ETA: 2:17 - loss: 0.6482 - regression_loss: 0.5647 - classification_loss: 0.0836 85/500 [====>.........................] - ETA: 2:16 - loss: 0.6475 - regression_loss: 0.5641 - classification_loss: 0.0834 86/500 [====>.........................] - ETA: 2:16 - loss: 0.6485 - regression_loss: 0.5650 - classification_loss: 0.0835 87/500 [====>.........................] - ETA: 2:15 - loss: 0.6460 - regression_loss: 0.5627 - classification_loss: 0.0833 88/500 [====>.........................] - ETA: 2:15 - loss: 0.6449 - regression_loss: 0.5620 - classification_loss: 0.0829 89/500 [====>.........................] - ETA: 2:15 - loss: 0.6456 - regression_loss: 0.5629 - classification_loss: 0.0827 90/500 [====>.........................] - ETA: 2:14 - loss: 0.6483 - regression_loss: 0.5650 - classification_loss: 0.0833 91/500 [====>.........................] - ETA: 2:14 - loss: 0.6504 - regression_loss: 0.5668 - classification_loss: 0.0836 92/500 [====>.........................] - ETA: 2:14 - loss: 0.6499 - regression_loss: 0.5666 - classification_loss: 0.0833 93/500 [====>.........................] - ETA: 2:13 - loss: 0.6610 - regression_loss: 0.5760 - classification_loss: 0.0850 94/500 [====>.........................] 
- ETA: 2:13 - loss: 0.6599 - regression_loss: 0.5752 - classification_loss: 0.0847 95/500 [====>.........................] - ETA: 2:13 - loss: 0.6647 - regression_loss: 0.5792 - classification_loss: 0.0854 96/500 [====>.........................] - ETA: 2:12 - loss: 0.6640 - regression_loss: 0.5788 - classification_loss: 0.0853 97/500 [====>.........................] - ETA: 2:12 - loss: 0.6676 - regression_loss: 0.5827 - classification_loss: 0.0849 98/500 [====>.........................] - ETA: 2:12 - loss: 0.6651 - regression_loss: 0.5803 - classification_loss: 0.0848 99/500 [====>.........................] - ETA: 2:11 - loss: 0.6723 - regression_loss: 0.5867 - classification_loss: 0.0856 100/500 [=====>........................] - ETA: 2:11 - loss: 0.6699 - regression_loss: 0.5849 - classification_loss: 0.0850 101/500 [=====>........................] - ETA: 2:11 - loss: 0.6734 - regression_loss: 0.5880 - classification_loss: 0.0854 102/500 [=====>........................] - ETA: 2:10 - loss: 0.6738 - regression_loss: 0.5885 - classification_loss: 0.0853 103/500 [=====>........................] - ETA: 2:10 - loss: 0.6728 - regression_loss: 0.5876 - classification_loss: 0.0852 104/500 [=====>........................] - ETA: 2:10 - loss: 0.6738 - regression_loss: 0.5886 - classification_loss: 0.0852 105/500 [=====>........................] - ETA: 2:09 - loss: 0.6704 - regression_loss: 0.5856 - classification_loss: 0.0848 106/500 [=====>........................] - ETA: 2:09 - loss: 0.6679 - regression_loss: 0.5836 - classification_loss: 0.0844 107/500 [=====>........................] - ETA: 2:08 - loss: 0.6699 - regression_loss: 0.5852 - classification_loss: 0.0846 108/500 [=====>........................] - ETA: 2:08 - loss: 0.6714 - regression_loss: 0.5869 - classification_loss: 0.0845 109/500 [=====>........................] - ETA: 2:08 - loss: 0.6719 - regression_loss: 0.5872 - classification_loss: 0.0847 110/500 [=====>........................] 
- ETA: 2:07 - loss: 0.6687 - regression_loss: 0.5845 - classification_loss: 0.0843 111/500 [=====>........................] - ETA: 2:07 - loss: 0.6704 - regression_loss: 0.5859 - classification_loss: 0.0845 112/500 [=====>........................] - ETA: 2:07 - loss: 0.6678 - regression_loss: 0.5838 - classification_loss: 0.0840 113/500 [=====>........................] - ETA: 2:07 - loss: 0.6679 - regression_loss: 0.5840 - classification_loss: 0.0839 114/500 [=====>........................] - ETA: 2:06 - loss: 0.6648 - regression_loss: 0.5813 - classification_loss: 0.0835 115/500 [=====>........................] - ETA: 2:06 - loss: 0.6660 - regression_loss: 0.5820 - classification_loss: 0.0840 116/500 [=====>........................] - ETA: 2:06 - loss: 0.6629 - regression_loss: 0.5793 - classification_loss: 0.0836 117/500 [======>.......................] - ETA: 2:05 - loss: 0.6648 - regression_loss: 0.5810 - classification_loss: 0.0838 118/500 [======>.......................] - ETA: 2:05 - loss: 0.6672 - regression_loss: 0.5829 - classification_loss: 0.0843 119/500 [======>.......................] - ETA: 2:05 - loss: 0.6647 - regression_loss: 0.5807 - classification_loss: 0.0840 120/500 [======>.......................] - ETA: 2:04 - loss: 0.6647 - regression_loss: 0.5808 - classification_loss: 0.0839 121/500 [======>.......................] - ETA: 2:04 - loss: 0.6670 - regression_loss: 0.5828 - classification_loss: 0.0841 122/500 [======>.......................] - ETA: 2:04 - loss: 0.6692 - regression_loss: 0.5851 - classification_loss: 0.0841 123/500 [======>.......................] - ETA: 2:03 - loss: 0.6721 - regression_loss: 0.5877 - classification_loss: 0.0844 124/500 [======>.......................] - ETA: 2:03 - loss: 0.6708 - regression_loss: 0.5867 - classification_loss: 0.0841 125/500 [======>.......................] - ETA: 2:03 - loss: 0.6713 - regression_loss: 0.5873 - classification_loss: 0.0840 126/500 [======>.......................] 
- ETA: 2:02 - loss: 0.6737 - regression_loss: 0.5895 - classification_loss: 0.0842 127/500 [======>.......................] - ETA: 2:02 - loss: 0.6703 - regression_loss: 0.5865 - classification_loss: 0.0839 128/500 [======>.......................] - ETA: 2:02 - loss: 0.6752 - regression_loss: 0.5903 - classification_loss: 0.0849 129/500 [======>.......................] - ETA: 2:01 - loss: 0.6764 - regression_loss: 0.5916 - classification_loss: 0.0848 130/500 [======>.......................] - ETA: 2:01 - loss: 0.6821 - regression_loss: 0.5963 - classification_loss: 0.0858 131/500 [======>.......................] - ETA: 2:01 - loss: 0.6814 - regression_loss: 0.5956 - classification_loss: 0.0859 132/500 [======>.......................] - ETA: 2:00 - loss: 0.6784 - regression_loss: 0.5927 - classification_loss: 0.0857 133/500 [======>.......................] - ETA: 2:00 - loss: 0.6767 - regression_loss: 0.5912 - classification_loss: 0.0855 134/500 [=======>......................] - ETA: 2:00 - loss: 0.6761 - regression_loss: 0.5908 - classification_loss: 0.0853 135/500 [=======>......................] - ETA: 1:59 - loss: 0.6799 - regression_loss: 0.5939 - classification_loss: 0.0859 136/500 [=======>......................] - ETA: 1:59 - loss: 0.6809 - regression_loss: 0.5947 - classification_loss: 0.0862 137/500 [=======>......................] - ETA: 1:59 - loss: 0.6796 - regression_loss: 0.5935 - classification_loss: 0.0861 138/500 [=======>......................] - ETA: 1:58 - loss: 0.6783 - regression_loss: 0.5927 - classification_loss: 0.0856 139/500 [=======>......................] - ETA: 1:58 - loss: 0.6808 - regression_loss: 0.5947 - classification_loss: 0.0861 140/500 [=======>......................] - ETA: 1:58 - loss: 0.6787 - regression_loss: 0.5931 - classification_loss: 0.0856 141/500 [=======>......................] - ETA: 1:57 - loss: 0.6775 - regression_loss: 0.5922 - classification_loss: 0.0853 142/500 [=======>......................] 
- ETA: 1:57 - loss: 0.6758 - regression_loss: 0.5907 - classification_loss: 0.0851 143/500 [=======>......................] - ETA: 1:57 - loss: 0.6755 - regression_loss: 0.5905 - classification_loss: 0.0850 144/500 [=======>......................] - ETA: 1:56 - loss: 0.6750 - regression_loss: 0.5900 - classification_loss: 0.0849 145/500 [=======>......................] - ETA: 1:56 - loss: 0.6750 - regression_loss: 0.5900 - classification_loss: 0.0850 146/500 [=======>......................] - ETA: 1:56 - loss: 0.6731 - regression_loss: 0.5884 - classification_loss: 0.0847 147/500 [=======>......................] - ETA: 1:56 - loss: 0.6724 - regression_loss: 0.5878 - classification_loss: 0.0847 148/500 [=======>......................] - ETA: 1:55 - loss: 0.6706 - regression_loss: 0.5862 - classification_loss: 0.0844 149/500 [=======>......................] - ETA: 1:55 - loss: 0.6681 - regression_loss: 0.5840 - classification_loss: 0.0841 150/500 [========>.....................] - ETA: 1:55 - loss: 0.6661 - regression_loss: 0.5824 - classification_loss: 0.0837 151/500 [========>.....................] - ETA: 1:54 - loss: 0.6640 - regression_loss: 0.5805 - classification_loss: 0.0835 152/500 [========>.....................] - ETA: 1:54 - loss: 0.6644 - regression_loss: 0.5807 - classification_loss: 0.0837 153/500 [========>.....................] - ETA: 1:54 - loss: 0.6623 - regression_loss: 0.5788 - classification_loss: 0.0834 154/500 [========>.....................] - ETA: 1:53 - loss: 0.6602 - regression_loss: 0.5771 - classification_loss: 0.0831 155/500 [========>.....................] - ETA: 1:53 - loss: 0.6580 - regression_loss: 0.5751 - classification_loss: 0.0828 156/500 [========>.....................] - ETA: 1:53 - loss: 0.6588 - regression_loss: 0.5759 - classification_loss: 0.0829 157/500 [========>.....................] - ETA: 1:52 - loss: 0.6608 - regression_loss: 0.5775 - classification_loss: 0.0832 158/500 [========>.....................] 
- ETA: 1:52 - loss: 0.6603 - regression_loss: 0.5772 - classification_loss: 0.0831 159/500 [========>.....................] - ETA: 1:52 - loss: 0.6588 - regression_loss: 0.5761 - classification_loss: 0.0828 160/500 [========>.....................] - ETA: 1:51 - loss: 0.6570 - regression_loss: 0.5745 - classification_loss: 0.0826 161/500 [========>.....................] - ETA: 1:51 - loss: 0.6542 - regression_loss: 0.5720 - classification_loss: 0.0822 162/500 [========>.....................] - ETA: 1:51 - loss: 0.6533 - regression_loss: 0.5713 - classification_loss: 0.0820 163/500 [========>.....................] - ETA: 1:50 - loss: 0.6513 - regression_loss: 0.5696 - classification_loss: 0.0817 164/500 [========>.....................] - ETA: 1:50 - loss: 0.6509 - regression_loss: 0.5692 - classification_loss: 0.0817 165/500 [========>.....................] - ETA: 1:50 - loss: 0.6491 - regression_loss: 0.5676 - classification_loss: 0.0814 166/500 [========>.....................] - ETA: 1:50 - loss: 0.6495 - regression_loss: 0.5681 - classification_loss: 0.0814 167/500 [=========>....................] - ETA: 1:49 - loss: 0.6499 - regression_loss: 0.5684 - classification_loss: 0.0815 168/500 [=========>....................] - ETA: 1:49 - loss: 0.6480 - regression_loss: 0.5667 - classification_loss: 0.0813 169/500 [=========>....................] - ETA: 1:49 - loss: 0.6470 - regression_loss: 0.5659 - classification_loss: 0.0811 170/500 [=========>....................] - ETA: 1:48 - loss: 0.6490 - regression_loss: 0.5673 - classification_loss: 0.0817 171/500 [=========>....................] - ETA: 1:48 - loss: 0.6489 - regression_loss: 0.5674 - classification_loss: 0.0815 172/500 [=========>....................] - ETA: 1:48 - loss: 0.6473 - regression_loss: 0.5660 - classification_loss: 0.0813 173/500 [=========>....................] - ETA: 1:47 - loss: 0.6471 - regression_loss: 0.5660 - classification_loss: 0.0812 174/500 [=========>....................] 
- ETA: 1:47 - loss: 0.6471 - regression_loss: 0.5659 - classification_loss: 0.0812 175/500 [=========>....................] - ETA: 1:47 - loss: 0.6478 - regression_loss: 0.5666 - classification_loss: 0.0812 176/500 [=========>....................] - ETA: 1:46 - loss: 0.6484 - regression_loss: 0.5672 - classification_loss: 0.0812 177/500 [=========>....................] - ETA: 1:46 - loss: 0.6496 - regression_loss: 0.5685 - classification_loss: 0.0812 178/500 [=========>....................] - ETA: 1:46 - loss: 0.6482 - regression_loss: 0.5671 - classification_loss: 0.0811 179/500 [=========>....................] - ETA: 1:45 - loss: 0.6483 - regression_loss: 0.5673 - classification_loss: 0.0810 180/500 [=========>....................] - ETA: 1:45 - loss: 0.6476 - regression_loss: 0.5667 - classification_loss: 0.0809 181/500 [=========>....................] - ETA: 1:45 - loss: 0.6476 - regression_loss: 0.5667 - classification_loss: 0.0809 182/500 [=========>....................] - ETA: 1:44 - loss: 0.6473 - regression_loss: 0.5663 - classification_loss: 0.0811 183/500 [=========>....................] - ETA: 1:44 - loss: 0.6482 - regression_loss: 0.5672 - classification_loss: 0.0811 184/500 [==========>...................] - ETA: 1:44 - loss: 0.6499 - regression_loss: 0.5686 - classification_loss: 0.0813 185/500 [==========>...................] - ETA: 1:43 - loss: 0.6475 - regression_loss: 0.5665 - classification_loss: 0.0810 186/500 [==========>...................] - ETA: 1:43 - loss: 0.6505 - regression_loss: 0.5691 - classification_loss: 0.0814 187/500 [==========>...................] - ETA: 1:43 - loss: 0.6516 - regression_loss: 0.5700 - classification_loss: 0.0816 188/500 [==========>...................] - ETA: 1:42 - loss: 0.6522 - regression_loss: 0.5705 - classification_loss: 0.0817 189/500 [==========>...................] - ETA: 1:42 - loss: 0.6520 - regression_loss: 0.5701 - classification_loss: 0.0819 190/500 [==========>...................] 
- ETA: 1:42 - loss: 0.6518 - regression_loss: 0.5699 - classification_loss: 0.0819 191/500 [==========>...................] - ETA: 1:41 - loss: 0.6524 - regression_loss: 0.5704 - classification_loss: 0.0820 192/500 [==========>...................] - ETA: 1:41 - loss: 0.6523 - regression_loss: 0.5704 - classification_loss: 0.0819 193/500 [==========>...................] - ETA: 1:41 - loss: 0.6508 - regression_loss: 0.5691 - classification_loss: 0.0817 194/500 [==========>...................] - ETA: 1:40 - loss: 0.6496 - regression_loss: 0.5680 - classification_loss: 0.0816 195/500 [==========>...................] - ETA: 1:40 - loss: 0.6505 - regression_loss: 0.5690 - classification_loss: 0.0816 196/500 [==========>...................] - ETA: 1:40 - loss: 0.6531 - regression_loss: 0.5710 - classification_loss: 0.0820 197/500 [==========>...................] - ETA: 1:39 - loss: 0.6519 - regression_loss: 0.5701 - classification_loss: 0.0818 198/500 [==========>...................] - ETA: 1:39 - loss: 0.6548 - regression_loss: 0.5725 - classification_loss: 0.0824 199/500 [==========>...................] - ETA: 1:39 - loss: 0.6545 - regression_loss: 0.5720 - classification_loss: 0.0824 200/500 [===========>..................] - ETA: 1:38 - loss: 0.6538 - regression_loss: 0.5716 - classification_loss: 0.0822 201/500 [===========>..................] - ETA: 1:38 - loss: 0.6563 - regression_loss: 0.5736 - classification_loss: 0.0827 202/500 [===========>..................] - ETA: 1:38 - loss: 0.6553 - regression_loss: 0.5728 - classification_loss: 0.0826 203/500 [===========>..................] - ETA: 1:37 - loss: 0.6553 - regression_loss: 0.5729 - classification_loss: 0.0824 204/500 [===========>..................] - ETA: 1:37 - loss: 0.6557 - regression_loss: 0.5733 - classification_loss: 0.0824 205/500 [===========>..................] - ETA: 1:37 - loss: 0.6542 - regression_loss: 0.5720 - classification_loss: 0.0822 206/500 [===========>..................] 
- ETA: 1:36 - loss: 0.6547 - regression_loss: 0.5725 - classification_loss: 0.0823 207/500 [===========>..................] - ETA: 1:36 - loss: 0.6553 - regression_loss: 0.5731 - classification_loss: 0.0822 208/500 [===========>..................] - ETA: 1:36 - loss: 0.6566 - regression_loss: 0.5742 - classification_loss: 0.0824 209/500 [===========>..................] - ETA: 1:35 - loss: 0.6564 - regression_loss: 0.5740 - classification_loss: 0.0824 210/500 [===========>..................] - ETA: 1:35 - loss: 0.6550 - regression_loss: 0.5728 - classification_loss: 0.0822 211/500 [===========>..................] - ETA: 1:35 - loss: 0.6568 - regression_loss: 0.5742 - classification_loss: 0.0825 212/500 [===========>..................] - ETA: 1:34 - loss: 0.6558 - regression_loss: 0.5734 - classification_loss: 0.0825 213/500 [===========>..................] - ETA: 1:34 - loss: 0.6561 - regression_loss: 0.5737 - classification_loss: 0.0824 214/500 [===========>..................] - ETA: 1:34 - loss: 0.6537 - regression_loss: 0.5716 - classification_loss: 0.0822 215/500 [===========>..................] - ETA: 1:33 - loss: 0.6535 - regression_loss: 0.5714 - classification_loss: 0.0821 216/500 [===========>..................] - ETA: 1:33 - loss: 0.6546 - regression_loss: 0.5724 - classification_loss: 0.0822 217/500 [============>.................] - ETA: 1:33 - loss: 0.6541 - regression_loss: 0.5721 - classification_loss: 0.0821 218/500 [============>.................] - ETA: 1:32 - loss: 0.6533 - regression_loss: 0.5713 - classification_loss: 0.0820 219/500 [============>.................] - ETA: 1:32 - loss: 0.6539 - regression_loss: 0.5720 - classification_loss: 0.0820 220/500 [============>.................] - ETA: 1:32 - loss: 0.6541 - regression_loss: 0.5720 - classification_loss: 0.0821 221/500 [============>.................] - ETA: 1:32 - loss: 0.6556 - regression_loss: 0.5736 - classification_loss: 0.0819 222/500 [============>.................] 
[epoch 46, steps 223-499: per-step progress-bar refreshes elided; loss hovered between ~0.65 and ~0.69 throughout]
500/500 [==============================] - 165s 330ms/step - loss: 0.6787 - regression_loss: 0.5945 - classification_loss: 0.0842
1172 instances of class plum with average precision: 0.6352
mAP: 0.6352
Epoch 00046: saving model to ./training/snapshots/resnet101_pascal_46.h5
Epoch 47/150
[epoch 47, steps 1-57: per-step progress-bar refreshes elided; running loss settled around 0.65-0.66 by step 56]
- ETA: 2:26 - loss: 0.6600 - regression_loss: 0.5856 - classification_loss: 0.0744 58/500 [==>...........................] - ETA: 2:26 - loss: 0.6638 - regression_loss: 0.5892 - classification_loss: 0.0746 59/500 [==>...........................] - ETA: 2:26 - loss: 0.6597 - regression_loss: 0.5858 - classification_loss: 0.0739 60/500 [==>...........................] - ETA: 2:25 - loss: 0.6558 - regression_loss: 0.5823 - classification_loss: 0.0735 61/500 [==>...........................] - ETA: 2:25 - loss: 0.6615 - regression_loss: 0.5872 - classification_loss: 0.0743 62/500 [==>...........................] - ETA: 2:25 - loss: 0.6596 - regression_loss: 0.5855 - classification_loss: 0.0741 63/500 [==>...........................] - ETA: 2:24 - loss: 0.6637 - regression_loss: 0.5885 - classification_loss: 0.0751 64/500 [==>...........................] - ETA: 2:24 - loss: 0.6644 - regression_loss: 0.5885 - classification_loss: 0.0758 65/500 [==>...........................] - ETA: 2:24 - loss: 0.6686 - regression_loss: 0.5920 - classification_loss: 0.0766 66/500 [==>...........................] - ETA: 2:23 - loss: 0.6714 - regression_loss: 0.5949 - classification_loss: 0.0765 67/500 [===>..........................] - ETA: 2:23 - loss: 0.6696 - regression_loss: 0.5938 - classification_loss: 0.0758 68/500 [===>..........................] - ETA: 2:23 - loss: 0.6742 - regression_loss: 0.5975 - classification_loss: 0.0767 69/500 [===>..........................] - ETA: 2:22 - loss: 0.6726 - regression_loss: 0.5962 - classification_loss: 0.0764 70/500 [===>..........................] - ETA: 2:22 - loss: 0.6741 - regression_loss: 0.5981 - classification_loss: 0.0760 71/500 [===>..........................] - ETA: 2:22 - loss: 0.6746 - regression_loss: 0.5989 - classification_loss: 0.0758 72/500 [===>..........................] - ETA: 2:21 - loss: 0.6796 - regression_loss: 0.6030 - classification_loss: 0.0766 73/500 [===>..........................] 
- ETA: 2:21 - loss: 0.6814 - regression_loss: 0.6048 - classification_loss: 0.0765 74/500 [===>..........................] - ETA: 2:21 - loss: 0.6836 - regression_loss: 0.6065 - classification_loss: 0.0771 75/500 [===>..........................] - ETA: 2:21 - loss: 0.6789 - regression_loss: 0.6024 - classification_loss: 0.0765 76/500 [===>..........................] - ETA: 2:20 - loss: 0.6752 - regression_loss: 0.5990 - classification_loss: 0.0762 77/500 [===>..........................] - ETA: 2:20 - loss: 0.6818 - regression_loss: 0.6054 - classification_loss: 0.0764 78/500 [===>..........................] - ETA: 2:19 - loss: 0.6864 - regression_loss: 0.6093 - classification_loss: 0.0772 79/500 [===>..........................] - ETA: 2:19 - loss: 0.6881 - regression_loss: 0.6101 - classification_loss: 0.0780 80/500 [===>..........................] - ETA: 2:19 - loss: 0.6863 - regression_loss: 0.6087 - classification_loss: 0.0776 81/500 [===>..........................] - ETA: 2:18 - loss: 0.6881 - regression_loss: 0.6103 - classification_loss: 0.0779 82/500 [===>..........................] - ETA: 2:18 - loss: 0.6850 - regression_loss: 0.6076 - classification_loss: 0.0774 83/500 [===>..........................] - ETA: 2:18 - loss: 0.6856 - regression_loss: 0.6081 - classification_loss: 0.0775 84/500 [====>.........................] - ETA: 2:17 - loss: 0.6882 - regression_loss: 0.6107 - classification_loss: 0.0775 85/500 [====>.........................] - ETA: 2:17 - loss: 0.6856 - regression_loss: 0.6084 - classification_loss: 0.0772 86/500 [====>.........................] - ETA: 2:17 - loss: 0.6877 - regression_loss: 0.6101 - classification_loss: 0.0777 87/500 [====>.........................] - ETA: 2:16 - loss: 0.6843 - regression_loss: 0.6070 - classification_loss: 0.0773 88/500 [====>.........................] - ETA: 2:16 - loss: 0.6851 - regression_loss: 0.6074 - classification_loss: 0.0777 89/500 [====>.........................] 
- ETA: 2:15 - loss: 0.6828 - regression_loss: 0.6054 - classification_loss: 0.0774 90/500 [====>.........................] - ETA: 2:15 - loss: 0.6799 - regression_loss: 0.6031 - classification_loss: 0.0768 91/500 [====>.........................] - ETA: 2:15 - loss: 0.6813 - regression_loss: 0.6039 - classification_loss: 0.0773 92/500 [====>.........................] - ETA: 2:14 - loss: 0.6777 - regression_loss: 0.6006 - classification_loss: 0.0770 93/500 [====>.........................] - ETA: 2:14 - loss: 0.6766 - regression_loss: 0.5997 - classification_loss: 0.0769 94/500 [====>.........................] - ETA: 2:14 - loss: 0.6776 - regression_loss: 0.6004 - classification_loss: 0.0772 95/500 [====>.........................] - ETA: 2:13 - loss: 0.6781 - regression_loss: 0.6007 - classification_loss: 0.0774 96/500 [====>.........................] - ETA: 2:13 - loss: 0.6769 - regression_loss: 0.5995 - classification_loss: 0.0774 97/500 [====>.........................] - ETA: 2:13 - loss: 0.6786 - regression_loss: 0.6011 - classification_loss: 0.0775 98/500 [====>.........................] - ETA: 2:12 - loss: 0.6779 - regression_loss: 0.6005 - classification_loss: 0.0775 99/500 [====>.........................] - ETA: 2:12 - loss: 0.6765 - regression_loss: 0.5993 - classification_loss: 0.0772 100/500 [=====>........................] - ETA: 2:12 - loss: 0.6777 - regression_loss: 0.6003 - classification_loss: 0.0774 101/500 [=====>........................] - ETA: 2:11 - loss: 0.6794 - regression_loss: 0.6013 - classification_loss: 0.0781 102/500 [=====>........................] - ETA: 2:11 - loss: 0.6823 - regression_loss: 0.6038 - classification_loss: 0.0785 103/500 [=====>........................] - ETA: 2:11 - loss: 0.6813 - regression_loss: 0.6028 - classification_loss: 0.0785 104/500 [=====>........................] - ETA: 2:10 - loss: 0.6817 - regression_loss: 0.6030 - classification_loss: 0.0787 105/500 [=====>........................] 
- ETA: 2:10 - loss: 0.6803 - regression_loss: 0.6019 - classification_loss: 0.0785 106/500 [=====>........................] - ETA: 2:10 - loss: 0.6781 - regression_loss: 0.6001 - classification_loss: 0.0780 107/500 [=====>........................] - ETA: 2:09 - loss: 0.6744 - regression_loss: 0.5970 - classification_loss: 0.0774 108/500 [=====>........................] - ETA: 2:09 - loss: 0.6739 - regression_loss: 0.5964 - classification_loss: 0.0775 109/500 [=====>........................] - ETA: 2:09 - loss: 0.6740 - regression_loss: 0.5963 - classification_loss: 0.0776 110/500 [=====>........................] - ETA: 2:08 - loss: 0.6741 - regression_loss: 0.5965 - classification_loss: 0.0776 111/500 [=====>........................] - ETA: 2:08 - loss: 0.6705 - regression_loss: 0.5934 - classification_loss: 0.0771 112/500 [=====>........................] - ETA: 2:08 - loss: 0.6673 - regression_loss: 0.5905 - classification_loss: 0.0767 113/500 [=====>........................] - ETA: 2:07 - loss: 0.6680 - regression_loss: 0.5914 - classification_loss: 0.0767 114/500 [=====>........................] - ETA: 2:07 - loss: 0.6653 - regression_loss: 0.5889 - classification_loss: 0.0763 115/500 [=====>........................] - ETA: 2:07 - loss: 0.6678 - regression_loss: 0.5907 - classification_loss: 0.0770 116/500 [=====>........................] - ETA: 2:06 - loss: 0.6683 - regression_loss: 0.5912 - classification_loss: 0.0771 117/500 [======>.......................] - ETA: 2:06 - loss: 0.6657 - regression_loss: 0.5890 - classification_loss: 0.0767 118/500 [======>.......................] - ETA: 2:06 - loss: 0.6691 - regression_loss: 0.5917 - classification_loss: 0.0774 119/500 [======>.......................] - ETA: 2:05 - loss: 0.6713 - regression_loss: 0.5933 - classification_loss: 0.0780 120/500 [======>.......................] - ETA: 2:05 - loss: 0.6758 - regression_loss: 0.5971 - classification_loss: 0.0787 121/500 [======>.......................] 
- ETA: 2:04 - loss: 0.6772 - regression_loss: 0.5983 - classification_loss: 0.0789 122/500 [======>.......................] - ETA: 2:04 - loss: 0.6740 - regression_loss: 0.5954 - classification_loss: 0.0785 123/500 [======>.......................] - ETA: 2:04 - loss: 0.6718 - regression_loss: 0.5934 - classification_loss: 0.0783 124/500 [======>.......................] - ETA: 2:03 - loss: 0.6687 - regression_loss: 0.5904 - classification_loss: 0.0783 125/500 [======>.......................] - ETA: 2:03 - loss: 0.6687 - regression_loss: 0.5901 - classification_loss: 0.0787 126/500 [======>.......................] - ETA: 2:03 - loss: 0.6688 - regression_loss: 0.5901 - classification_loss: 0.0787 127/500 [======>.......................] - ETA: 2:03 - loss: 0.6685 - regression_loss: 0.5897 - classification_loss: 0.0788 128/500 [======>.......................] - ETA: 2:02 - loss: 0.6673 - regression_loss: 0.5886 - classification_loss: 0.0786 129/500 [======>.......................] - ETA: 2:02 - loss: 0.6637 - regression_loss: 0.5855 - classification_loss: 0.0782 130/500 [======>.......................] - ETA: 2:02 - loss: 0.6611 - regression_loss: 0.5834 - classification_loss: 0.0778 131/500 [======>.......................] - ETA: 2:01 - loss: 0.6617 - regression_loss: 0.5837 - classification_loss: 0.0780 132/500 [======>.......................] - ETA: 2:01 - loss: 0.6596 - regression_loss: 0.5818 - classification_loss: 0.0777 133/500 [======>.......................] - ETA: 2:01 - loss: 0.6589 - regression_loss: 0.5812 - classification_loss: 0.0777 134/500 [=======>......................] - ETA: 2:00 - loss: 0.6594 - regression_loss: 0.5817 - classification_loss: 0.0777 135/500 [=======>......................] - ETA: 2:00 - loss: 0.6624 - regression_loss: 0.5843 - classification_loss: 0.0781 136/500 [=======>......................] - ETA: 2:00 - loss: 0.6624 - regression_loss: 0.5845 - classification_loss: 0.0779 137/500 [=======>......................] 
- ETA: 1:59 - loss: 0.6608 - regression_loss: 0.5831 - classification_loss: 0.0777 138/500 [=======>......................] - ETA: 1:59 - loss: 0.6607 - regression_loss: 0.5830 - classification_loss: 0.0777 139/500 [=======>......................] - ETA: 1:59 - loss: 0.6647 - regression_loss: 0.5862 - classification_loss: 0.0785 140/500 [=======>......................] - ETA: 1:58 - loss: 0.6647 - regression_loss: 0.5864 - classification_loss: 0.0784 141/500 [=======>......................] - ETA: 1:58 - loss: 0.6647 - regression_loss: 0.5862 - classification_loss: 0.0786 142/500 [=======>......................] - ETA: 1:58 - loss: 0.6639 - regression_loss: 0.5857 - classification_loss: 0.0782 143/500 [=======>......................] - ETA: 1:57 - loss: 0.6620 - regression_loss: 0.5840 - classification_loss: 0.0780 144/500 [=======>......................] - ETA: 1:57 - loss: 0.6646 - regression_loss: 0.5859 - classification_loss: 0.0787 145/500 [=======>......................] - ETA: 1:57 - loss: 0.6646 - regression_loss: 0.5858 - classification_loss: 0.0789 146/500 [=======>......................] - ETA: 1:56 - loss: 0.6667 - regression_loss: 0.5873 - classification_loss: 0.0794 147/500 [=======>......................] - ETA: 1:56 - loss: 0.6673 - regression_loss: 0.5877 - classification_loss: 0.0796 148/500 [=======>......................] - ETA: 1:56 - loss: 0.6679 - regression_loss: 0.5885 - classification_loss: 0.0795 149/500 [=======>......................] - ETA: 1:55 - loss: 0.6663 - regression_loss: 0.5869 - classification_loss: 0.0794 150/500 [========>.....................] - ETA: 1:55 - loss: 0.6638 - regression_loss: 0.5847 - classification_loss: 0.0790 151/500 [========>.....................] - ETA: 1:55 - loss: 0.6623 - regression_loss: 0.5832 - classification_loss: 0.0791 152/500 [========>.....................] - ETA: 1:54 - loss: 0.6666 - regression_loss: 0.5867 - classification_loss: 0.0799 153/500 [========>.....................] 
- ETA: 1:54 - loss: 0.6667 - regression_loss: 0.5867 - classification_loss: 0.0800 154/500 [========>.....................] - ETA: 1:54 - loss: 0.6685 - regression_loss: 0.5882 - classification_loss: 0.0803 155/500 [========>.....................] - ETA: 1:53 - loss: 0.6681 - regression_loss: 0.5877 - classification_loss: 0.0804 156/500 [========>.....................] - ETA: 1:53 - loss: 0.6675 - regression_loss: 0.5871 - classification_loss: 0.0804 157/500 [========>.....................] - ETA: 1:53 - loss: 0.6658 - regression_loss: 0.5856 - classification_loss: 0.0802 158/500 [========>.....................] - ETA: 1:52 - loss: 0.6665 - regression_loss: 0.5859 - classification_loss: 0.0806 159/500 [========>.....................] - ETA: 1:52 - loss: 0.6658 - regression_loss: 0.5854 - classification_loss: 0.0805 160/500 [========>.....................] - ETA: 1:52 - loss: 0.6659 - regression_loss: 0.5851 - classification_loss: 0.0807 161/500 [========>.....................] - ETA: 1:51 - loss: 0.6684 - regression_loss: 0.5872 - classification_loss: 0.0812 162/500 [========>.....................] - ETA: 1:51 - loss: 0.6675 - regression_loss: 0.5865 - classification_loss: 0.0810 163/500 [========>.....................] - ETA: 1:51 - loss: 0.6670 - regression_loss: 0.5859 - classification_loss: 0.0811 164/500 [========>.....................] - ETA: 1:50 - loss: 0.6676 - regression_loss: 0.5864 - classification_loss: 0.0812 165/500 [========>.....................] - ETA: 1:50 - loss: 0.6661 - regression_loss: 0.5851 - classification_loss: 0.0810 166/500 [========>.....................] - ETA: 1:50 - loss: 0.6647 - regression_loss: 0.5838 - classification_loss: 0.0810 167/500 [=========>....................] - ETA: 1:49 - loss: 0.6663 - regression_loss: 0.5852 - classification_loss: 0.0811 168/500 [=========>....................] - ETA: 1:49 - loss: 0.6679 - regression_loss: 0.5865 - classification_loss: 0.0814 169/500 [=========>....................] 
- ETA: 1:49 - loss: 0.6677 - regression_loss: 0.5864 - classification_loss: 0.0813 170/500 [=========>....................] - ETA: 1:48 - loss: 0.6681 - regression_loss: 0.5868 - classification_loss: 0.0813 171/500 [=========>....................] - ETA: 1:48 - loss: 0.6676 - regression_loss: 0.5864 - classification_loss: 0.0812 172/500 [=========>....................] - ETA: 1:48 - loss: 0.6682 - regression_loss: 0.5869 - classification_loss: 0.0813 173/500 [=========>....................] - ETA: 1:47 - loss: 0.6691 - regression_loss: 0.5876 - classification_loss: 0.0815 174/500 [=========>....................] - ETA: 1:47 - loss: 0.6695 - regression_loss: 0.5880 - classification_loss: 0.0815 175/500 [=========>....................] - ETA: 1:47 - loss: 0.6719 - regression_loss: 0.5903 - classification_loss: 0.0816 176/500 [=========>....................] - ETA: 1:46 - loss: 0.6744 - regression_loss: 0.5922 - classification_loss: 0.0821 177/500 [=========>....................] - ETA: 1:46 - loss: 0.6758 - regression_loss: 0.5932 - classification_loss: 0.0826 178/500 [=========>....................] - ETA: 1:46 - loss: 0.6734 - regression_loss: 0.5911 - classification_loss: 0.0823 179/500 [=========>....................] - ETA: 1:45 - loss: 0.6721 - regression_loss: 0.5899 - classification_loss: 0.0822 180/500 [=========>....................] - ETA: 1:45 - loss: 0.6709 - regression_loss: 0.5889 - classification_loss: 0.0820 181/500 [=========>....................] - ETA: 1:45 - loss: 0.6708 - regression_loss: 0.5889 - classification_loss: 0.0819 182/500 [=========>....................] - ETA: 1:44 - loss: 0.6682 - regression_loss: 0.5866 - classification_loss: 0.0816 183/500 [=========>....................] - ETA: 1:44 - loss: 0.6689 - regression_loss: 0.5872 - classification_loss: 0.0817 184/500 [==========>...................] - ETA: 1:44 - loss: 0.6705 - regression_loss: 0.5886 - classification_loss: 0.0819 185/500 [==========>...................] 
- ETA: 1:43 - loss: 0.6693 - regression_loss: 0.5875 - classification_loss: 0.0818 186/500 [==========>...................] - ETA: 1:43 - loss: 0.6680 - regression_loss: 0.5865 - classification_loss: 0.0815 187/500 [==========>...................] - ETA: 1:43 - loss: 0.6692 - regression_loss: 0.5875 - classification_loss: 0.0817 188/500 [==========>...................] - ETA: 1:42 - loss: 0.6696 - regression_loss: 0.5879 - classification_loss: 0.0816 189/500 [==========>...................] - ETA: 1:42 - loss: 0.6685 - regression_loss: 0.5870 - classification_loss: 0.0815 190/500 [==========>...................] - ETA: 1:42 - loss: 0.6682 - regression_loss: 0.5867 - classification_loss: 0.0815 191/500 [==========>...................] - ETA: 1:41 - loss: 0.6696 - regression_loss: 0.5879 - classification_loss: 0.0817 192/500 [==========>...................] - ETA: 1:41 - loss: 0.6675 - regression_loss: 0.5861 - classification_loss: 0.0814 193/500 [==========>...................] - ETA: 1:41 - loss: 0.6677 - regression_loss: 0.5860 - classification_loss: 0.0817 194/500 [==========>...................] - ETA: 1:40 - loss: 0.6679 - regression_loss: 0.5863 - classification_loss: 0.0816 195/500 [==========>...................] - ETA: 1:40 - loss: 0.6676 - regression_loss: 0.5861 - classification_loss: 0.0815 196/500 [==========>...................] - ETA: 1:40 - loss: 0.6682 - regression_loss: 0.5866 - classification_loss: 0.0816 197/500 [==========>...................] - ETA: 1:39 - loss: 0.6700 - regression_loss: 0.5881 - classification_loss: 0.0819 198/500 [==========>...................] - ETA: 1:39 - loss: 0.6700 - regression_loss: 0.5882 - classification_loss: 0.0818 199/500 [==========>...................] - ETA: 1:39 - loss: 0.6693 - regression_loss: 0.5875 - classification_loss: 0.0818 200/500 [===========>..................] - ETA: 1:38 - loss: 0.6691 - regression_loss: 0.5872 - classification_loss: 0.0819 201/500 [===========>..................] 
- ETA: 1:38 - loss: 0.6696 - regression_loss: 0.5876 - classification_loss: 0.0819 202/500 [===========>..................] - ETA: 1:38 - loss: 0.6692 - regression_loss: 0.5873 - classification_loss: 0.0819 203/500 [===========>..................] - ETA: 1:37 - loss: 0.6698 - regression_loss: 0.5879 - classification_loss: 0.0818 204/500 [===========>..................] - ETA: 1:37 - loss: 0.6682 - regression_loss: 0.5866 - classification_loss: 0.0816 205/500 [===========>..................] - ETA: 1:37 - loss: 0.6676 - regression_loss: 0.5862 - classification_loss: 0.0814 206/500 [===========>..................] - ETA: 1:36 - loss: 0.6681 - regression_loss: 0.5865 - classification_loss: 0.0815 207/500 [===========>..................] - ETA: 1:36 - loss: 0.6677 - regression_loss: 0.5862 - classification_loss: 0.0815 208/500 [===========>..................] - ETA: 1:36 - loss: 0.6668 - regression_loss: 0.5855 - classification_loss: 0.0813 209/500 [===========>..................] - ETA: 1:35 - loss: 0.6684 - regression_loss: 0.5867 - classification_loss: 0.0817 210/500 [===========>..................] - ETA: 1:35 - loss: 0.6667 - regression_loss: 0.5853 - classification_loss: 0.0814 211/500 [===========>..................] - ETA: 1:35 - loss: 0.6653 - regression_loss: 0.5841 - classification_loss: 0.0811 212/500 [===========>..................] - ETA: 1:34 - loss: 0.6646 - regression_loss: 0.5835 - classification_loss: 0.0810 213/500 [===========>..................] - ETA: 1:34 - loss: 0.6641 - regression_loss: 0.5832 - classification_loss: 0.0808 214/500 [===========>..................] - ETA: 1:34 - loss: 0.6656 - regression_loss: 0.5845 - classification_loss: 0.0811 215/500 [===========>..................] - ETA: 1:33 - loss: 0.6658 - regression_loss: 0.5846 - classification_loss: 0.0813 216/500 [===========>..................] - ETA: 1:33 - loss: 0.6643 - regression_loss: 0.5830 - classification_loss: 0.0813 217/500 [============>.................] 
- ETA: 1:33 - loss: 0.6627 - regression_loss: 0.5817 - classification_loss: 0.0810 218/500 [============>.................] - ETA: 1:32 - loss: 0.6625 - regression_loss: 0.5816 - classification_loss: 0.0809 219/500 [============>.................] - ETA: 1:32 - loss: 0.6613 - regression_loss: 0.5805 - classification_loss: 0.0807 220/500 [============>.................] - ETA: 1:32 - loss: 0.6607 - regression_loss: 0.5801 - classification_loss: 0.0806 221/500 [============>.................] - ETA: 1:32 - loss: 0.6627 - regression_loss: 0.5818 - classification_loss: 0.0809 222/500 [============>.................] - ETA: 1:31 - loss: 0.6626 - regression_loss: 0.5818 - classification_loss: 0.0808 223/500 [============>.................] - ETA: 1:31 - loss: 0.6616 - regression_loss: 0.5810 - classification_loss: 0.0806 224/500 [============>.................] - ETA: 1:31 - loss: 0.6625 - regression_loss: 0.5818 - classification_loss: 0.0807 225/500 [============>.................] - ETA: 1:30 - loss: 0.6623 - regression_loss: 0.5816 - classification_loss: 0.0806 226/500 [============>.................] - ETA: 1:30 - loss: 0.6621 - regression_loss: 0.5815 - classification_loss: 0.0806 227/500 [============>.................] - ETA: 1:30 - loss: 0.6620 - regression_loss: 0.5813 - classification_loss: 0.0806 228/500 [============>.................] - ETA: 1:29 - loss: 0.6601 - regression_loss: 0.5797 - classification_loss: 0.0804 229/500 [============>.................] - ETA: 1:29 - loss: 0.6614 - regression_loss: 0.5809 - classification_loss: 0.0805 230/500 [============>.................] - ETA: 1:29 - loss: 0.6606 - regression_loss: 0.5802 - classification_loss: 0.0803 231/500 [============>.................] - ETA: 1:28 - loss: 0.6630 - regression_loss: 0.5822 - classification_loss: 0.0807 232/500 [============>.................] - ETA: 1:28 - loss: 0.6623 - regression_loss: 0.5817 - classification_loss: 0.0806 233/500 [============>.................] 
- ETA: 1:28 - loss: 0.6639 - regression_loss: 0.5830 - classification_loss: 0.0809 234/500 [=============>................] - ETA: 1:27 - loss: 0.6651 - regression_loss: 0.5840 - classification_loss: 0.0811 235/500 [=============>................] - ETA: 1:27 - loss: 0.6632 - regression_loss: 0.5824 - classification_loss: 0.0808 236/500 [=============>................] - ETA: 1:27 - loss: 0.6629 - regression_loss: 0.5822 - classification_loss: 0.0807 237/500 [=============>................] - ETA: 1:26 - loss: 0.6623 - regression_loss: 0.5817 - classification_loss: 0.0806 238/500 [=============>................] - ETA: 1:26 - loss: 0.6631 - regression_loss: 0.5821 - classification_loss: 0.0810 239/500 [=============>................] - ETA: 1:26 - loss: 0.6630 - regression_loss: 0.5818 - classification_loss: 0.0812 240/500 [=============>................] - ETA: 1:25 - loss: 0.6641 - regression_loss: 0.5827 - classification_loss: 0.0814 241/500 [=============>................] - ETA: 1:25 - loss: 0.6646 - regression_loss: 0.5832 - classification_loss: 0.0815 242/500 [=============>................] - ETA: 1:25 - loss: 0.6645 - regression_loss: 0.5830 - classification_loss: 0.0815 243/500 [=============>................] - ETA: 1:24 - loss: 0.6638 - regression_loss: 0.5823 - classification_loss: 0.0815 244/500 [=============>................] - ETA: 1:24 - loss: 0.6666 - regression_loss: 0.5846 - classification_loss: 0.0820 245/500 [=============>................] - ETA: 1:24 - loss: 0.6700 - regression_loss: 0.5876 - classification_loss: 0.0824 246/500 [=============>................] - ETA: 1:23 - loss: 0.6707 - regression_loss: 0.5882 - classification_loss: 0.0825 247/500 [=============>................] - ETA: 1:23 - loss: 0.6696 - regression_loss: 0.5873 - classification_loss: 0.0823 248/500 [=============>................] - ETA: 1:23 - loss: 0.6686 - regression_loss: 0.5865 - classification_loss: 0.0821 249/500 [=============>................] 
- ETA: 1:22 - loss: 0.6683 - regression_loss: 0.5863 - classification_loss: 0.0820 250/500 [==============>...............] - ETA: 1:22 - loss: 0.6693 - regression_loss: 0.5870 - classification_loss: 0.0824 251/500 [==============>...............] - ETA: 1:22 - loss: 0.6693 - regression_loss: 0.5868 - classification_loss: 0.0825 252/500 [==============>...............] - ETA: 1:21 - loss: 0.6694 - regression_loss: 0.5867 - classification_loss: 0.0826 253/500 [==============>...............] - ETA: 1:21 - loss: 0.6697 - regression_loss: 0.5870 - classification_loss: 0.0827 254/500 [==============>...............] - ETA: 1:21 - loss: 0.6714 - regression_loss: 0.5884 - classification_loss: 0.0830 255/500 [==============>...............] - ETA: 1:20 - loss: 0.6702 - regression_loss: 0.5874 - classification_loss: 0.0829 256/500 [==============>...............] - ETA: 1:20 - loss: 0.6691 - regression_loss: 0.5864 - classification_loss: 0.0827 257/500 [==============>...............] - ETA: 1:20 - loss: 0.6711 - regression_loss: 0.5882 - classification_loss: 0.0829 258/500 [==============>...............] - ETA: 1:19 - loss: 0.6722 - regression_loss: 0.5893 - classification_loss: 0.0829 259/500 [==============>...............] - ETA: 1:19 - loss: 0.6710 - regression_loss: 0.5882 - classification_loss: 0.0828 260/500 [==============>...............] - ETA: 1:19 - loss: 0.6710 - regression_loss: 0.5883 - classification_loss: 0.0827 261/500 [==============>...............] - ETA: 1:18 - loss: 0.6724 - regression_loss: 0.5894 - classification_loss: 0.0830 262/500 [==============>...............] - ETA: 1:18 - loss: 0.6717 - regression_loss: 0.5888 - classification_loss: 0.0829 263/500 [==============>...............] - ETA: 1:18 - loss: 0.6718 - regression_loss: 0.5890 - classification_loss: 0.0828 264/500 [==============>...............] - ETA: 1:17 - loss: 0.6708 - regression_loss: 0.5881 - classification_loss: 0.0827 265/500 [==============>...............] 
[Epoch 47/150 per-batch progress updates (steps 266-489 of 500) elided]
500/500 [==============================] - 165s 330ms/step - loss: 0.6612 - regression_loss: 0.5793 - classification_loss: 0.0819
1172 instances of class plum with average precision: 0.6369
mAP: 0.6369
Epoch 00047: saving model to ./training/snapshots/resnet101_pascal_47.h5
Epoch 48/150
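The epoch-47 summary above reports the total `loss` alongside its two components, and in this RetinaNet setup the total is simply their sum (0.5793 + 0.0819 = 0.6612). A minimal sketch of checking that relationship by parsing a summary line of this shape (the parsing helper is illustrative, not part of the training script):

```python
import re

# Final summary line from epoch 47 above.
line = ("500/500 [==============================] - 165s 330ms/step - "
        "loss: 0.6612 - regression_loss: 0.5793 - classification_loss: 0.0819")

# Pull out every "name: value" metric pair from the progress-bar line.
metrics = {name: float(val)
           for name, val in re.findall(r"(\w+): (\d+\.\d+)", line)}

total = metrics["loss"]
parts = metrics["regression_loss"] + metrics["classification_loss"]

# The progress bar rounds each metric to 4 decimals, so allow rounding slack.
assert abs(total - parts) < 1e-3
```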
[Epoch 48/150 per-batch progress updates elided (through step 100/500)]
- ETA: 2:11 - loss: 0.6549 - regression_loss: 0.5737 - classification_loss: 0.0812 101/500 [=====>........................] - ETA: 2:11 - loss: 0.6554 - regression_loss: 0.5744 - classification_loss: 0.0809 102/500 [=====>........................] - ETA: 2:10 - loss: 0.6638 - regression_loss: 0.5815 - classification_loss: 0.0824 103/500 [=====>........................] - ETA: 2:10 - loss: 0.6650 - regression_loss: 0.5822 - classification_loss: 0.0827 104/500 [=====>........................] - ETA: 2:10 - loss: 0.6613 - regression_loss: 0.5790 - classification_loss: 0.0823 105/500 [=====>........................] - ETA: 2:10 - loss: 0.6677 - regression_loss: 0.5843 - classification_loss: 0.0834 106/500 [=====>........................] - ETA: 2:09 - loss: 0.6673 - regression_loss: 0.5842 - classification_loss: 0.0831 107/500 [=====>........................] - ETA: 2:09 - loss: 0.6657 - regression_loss: 0.5829 - classification_loss: 0.0828 108/500 [=====>........................] - ETA: 2:09 - loss: 0.6687 - regression_loss: 0.5856 - classification_loss: 0.0831 109/500 [=====>........................] - ETA: 2:08 - loss: 0.6659 - regression_loss: 0.5833 - classification_loss: 0.0826 110/500 [=====>........................] - ETA: 2:08 - loss: 0.6664 - regression_loss: 0.5838 - classification_loss: 0.0826 111/500 [=====>........................] - ETA: 2:07 - loss: 0.6637 - regression_loss: 0.5814 - classification_loss: 0.0822 112/500 [=====>........................] - ETA: 2:07 - loss: 0.6662 - regression_loss: 0.5836 - classification_loss: 0.0826 113/500 [=====>........................] - ETA: 2:07 - loss: 0.6694 - regression_loss: 0.5861 - classification_loss: 0.0833 114/500 [=====>........................] - ETA: 2:06 - loss: 0.6686 - regression_loss: 0.5856 - classification_loss: 0.0830 115/500 [=====>........................] - ETA: 2:06 - loss: 0.6673 - regression_loss: 0.5844 - classification_loss: 0.0829 116/500 [=====>........................] 
- ETA: 2:06 - loss: 0.6681 - regression_loss: 0.5851 - classification_loss: 0.0830 117/500 [======>.......................] - ETA: 2:05 - loss: 0.6648 - regression_loss: 0.5821 - classification_loss: 0.0826 118/500 [======>.......................] - ETA: 2:05 - loss: 0.6655 - regression_loss: 0.5830 - classification_loss: 0.0824 119/500 [======>.......................] - ETA: 2:05 - loss: 0.6686 - regression_loss: 0.5858 - classification_loss: 0.0828 120/500 [======>.......................] - ETA: 2:04 - loss: 0.6679 - regression_loss: 0.5853 - classification_loss: 0.0826 121/500 [======>.......................] - ETA: 2:04 - loss: 0.6657 - regression_loss: 0.5835 - classification_loss: 0.0822 122/500 [======>.......................] - ETA: 2:04 - loss: 0.6677 - regression_loss: 0.5851 - classification_loss: 0.0826 123/500 [======>.......................] - ETA: 2:03 - loss: 0.6685 - regression_loss: 0.5857 - classification_loss: 0.0828 124/500 [======>.......................] - ETA: 2:03 - loss: 0.6703 - regression_loss: 0.5873 - classification_loss: 0.0830 125/500 [======>.......................] - ETA: 2:02 - loss: 0.6695 - regression_loss: 0.5865 - classification_loss: 0.0830 126/500 [======>.......................] - ETA: 2:02 - loss: 0.6679 - regression_loss: 0.5853 - classification_loss: 0.0826 127/500 [======>.......................] - ETA: 2:02 - loss: 0.6656 - regression_loss: 0.5833 - classification_loss: 0.0823 128/500 [======>.......................] - ETA: 2:01 - loss: 0.6661 - regression_loss: 0.5840 - classification_loss: 0.0821 129/500 [======>.......................] - ETA: 2:01 - loss: 0.6667 - regression_loss: 0.5847 - classification_loss: 0.0820 130/500 [======>.......................] - ETA: 2:01 - loss: 0.6701 - regression_loss: 0.5874 - classification_loss: 0.0827 131/500 [======>.......................] - ETA: 2:00 - loss: 0.6691 - regression_loss: 0.5865 - classification_loss: 0.0826 132/500 [======>.......................] 
- ETA: 2:00 - loss: 0.6698 - regression_loss: 0.5871 - classification_loss: 0.0828 133/500 [======>.......................] - ETA: 2:00 - loss: 0.6719 - regression_loss: 0.5888 - classification_loss: 0.0831 134/500 [=======>......................] - ETA: 2:00 - loss: 0.6705 - regression_loss: 0.5878 - classification_loss: 0.0828 135/500 [=======>......................] - ETA: 1:59 - loss: 0.6697 - regression_loss: 0.5873 - classification_loss: 0.0825 136/500 [=======>......................] - ETA: 1:59 - loss: 0.6678 - regression_loss: 0.5857 - classification_loss: 0.0821 137/500 [=======>......................] - ETA: 1:58 - loss: 0.6709 - regression_loss: 0.5887 - classification_loss: 0.0822 138/500 [=======>......................] - ETA: 1:58 - loss: 0.6766 - regression_loss: 0.5937 - classification_loss: 0.0829 139/500 [=======>......................] - ETA: 1:58 - loss: 0.6768 - regression_loss: 0.5941 - classification_loss: 0.0827 140/500 [=======>......................] - ETA: 1:57 - loss: 0.6756 - regression_loss: 0.5929 - classification_loss: 0.0827 141/500 [=======>......................] - ETA: 1:57 - loss: 0.6764 - regression_loss: 0.5940 - classification_loss: 0.0824 142/500 [=======>......................] - ETA: 1:57 - loss: 0.6760 - regression_loss: 0.5938 - classification_loss: 0.0823 143/500 [=======>......................] - ETA: 1:56 - loss: 0.6752 - regression_loss: 0.5928 - classification_loss: 0.0824 144/500 [=======>......................] - ETA: 1:56 - loss: 0.6764 - regression_loss: 0.5937 - classification_loss: 0.0826 145/500 [=======>......................] - ETA: 1:56 - loss: 0.6765 - regression_loss: 0.5939 - classification_loss: 0.0826 146/500 [=======>......................] - ETA: 1:55 - loss: 0.6763 - regression_loss: 0.5938 - classification_loss: 0.0825 147/500 [=======>......................] - ETA: 1:55 - loss: 0.6743 - regression_loss: 0.5921 - classification_loss: 0.0823 148/500 [=======>......................] 
- ETA: 1:55 - loss: 0.6739 - regression_loss: 0.5919 - classification_loss: 0.0820 149/500 [=======>......................] - ETA: 1:55 - loss: 0.6710 - regression_loss: 0.5894 - classification_loss: 0.0816 150/500 [========>.....................] - ETA: 1:54 - loss: 0.6696 - regression_loss: 0.5882 - classification_loss: 0.0814 151/500 [========>.....................] - ETA: 1:54 - loss: 0.6706 - regression_loss: 0.5889 - classification_loss: 0.0816 152/500 [========>.....................] - ETA: 1:54 - loss: 0.6691 - regression_loss: 0.5877 - classification_loss: 0.0814 153/500 [========>.....................] - ETA: 1:53 - loss: 0.6709 - regression_loss: 0.5893 - classification_loss: 0.0816 154/500 [========>.....................] - ETA: 1:53 - loss: 0.6703 - regression_loss: 0.5887 - classification_loss: 0.0816 155/500 [========>.....................] - ETA: 1:53 - loss: 0.6719 - regression_loss: 0.5903 - classification_loss: 0.0816 156/500 [========>.....................] - ETA: 1:52 - loss: 0.6686 - regression_loss: 0.5875 - classification_loss: 0.0811 157/500 [========>.....................] - ETA: 1:52 - loss: 0.6691 - regression_loss: 0.5879 - classification_loss: 0.0813 158/500 [========>.....................] - ETA: 1:52 - loss: 0.6682 - regression_loss: 0.5872 - classification_loss: 0.0810 159/500 [========>.....................] - ETA: 1:51 - loss: 0.6659 - regression_loss: 0.5852 - classification_loss: 0.0807 160/500 [========>.....................] - ETA: 1:51 - loss: 0.6664 - regression_loss: 0.5856 - classification_loss: 0.0808 161/500 [========>.....................] - ETA: 1:51 - loss: 0.6665 - regression_loss: 0.5857 - classification_loss: 0.0808 162/500 [========>.....................] - ETA: 1:50 - loss: 0.6639 - regression_loss: 0.5834 - classification_loss: 0.0806 163/500 [========>.....................] - ETA: 1:50 - loss: 0.6668 - regression_loss: 0.5858 - classification_loss: 0.0810 164/500 [========>.....................] 
- ETA: 1:50 - loss: 0.6663 - regression_loss: 0.5854 - classification_loss: 0.0809 165/500 [========>.....................] - ETA: 1:49 - loss: 0.6668 - regression_loss: 0.5856 - classification_loss: 0.0811 166/500 [========>.....................] - ETA: 1:49 - loss: 0.6668 - regression_loss: 0.5855 - classification_loss: 0.0813 167/500 [=========>....................] - ETA: 1:49 - loss: 0.6698 - regression_loss: 0.5879 - classification_loss: 0.0819 168/500 [=========>....................] - ETA: 1:48 - loss: 0.6700 - regression_loss: 0.5881 - classification_loss: 0.0819 169/500 [=========>....................] - ETA: 1:48 - loss: 0.6678 - regression_loss: 0.5862 - classification_loss: 0.0816 170/500 [=========>....................] - ETA: 1:48 - loss: 0.6672 - regression_loss: 0.5858 - classification_loss: 0.0814 171/500 [=========>....................] - ETA: 1:47 - loss: 0.6650 - regression_loss: 0.5840 - classification_loss: 0.0811 172/500 [=========>....................] - ETA: 1:47 - loss: 0.6667 - regression_loss: 0.5853 - classification_loss: 0.0813 173/500 [=========>....................] - ETA: 1:47 - loss: 0.6645 - regression_loss: 0.5835 - classification_loss: 0.0810 174/500 [=========>....................] - ETA: 1:46 - loss: 0.6619 - regression_loss: 0.5811 - classification_loss: 0.0809 175/500 [=========>....................] - ETA: 1:46 - loss: 0.6630 - regression_loss: 0.5820 - classification_loss: 0.0810 176/500 [=========>....................] - ETA: 1:46 - loss: 0.6615 - regression_loss: 0.5807 - classification_loss: 0.0808 177/500 [=========>....................] - ETA: 1:45 - loss: 0.6607 - regression_loss: 0.5800 - classification_loss: 0.0807 178/500 [=========>....................] - ETA: 1:45 - loss: 0.6620 - regression_loss: 0.5812 - classification_loss: 0.0807 179/500 [=========>....................] - ETA: 1:45 - loss: 0.6618 - regression_loss: 0.5811 - classification_loss: 0.0807 180/500 [=========>....................] 
- ETA: 1:44 - loss: 0.6603 - regression_loss: 0.5798 - classification_loss: 0.0804 181/500 [=========>....................] - ETA: 1:44 - loss: 0.6627 - regression_loss: 0.5819 - classification_loss: 0.0808 182/500 [=========>....................] - ETA: 1:44 - loss: 0.6622 - regression_loss: 0.5815 - classification_loss: 0.0807 183/500 [=========>....................] - ETA: 1:43 - loss: 0.6602 - regression_loss: 0.5797 - classification_loss: 0.0804 184/500 [==========>...................] - ETA: 1:43 - loss: 0.6621 - regression_loss: 0.5813 - classification_loss: 0.0808 185/500 [==========>...................] - ETA: 1:43 - loss: 0.6620 - regression_loss: 0.5813 - classification_loss: 0.0807 186/500 [==========>...................] - ETA: 1:42 - loss: 0.6633 - regression_loss: 0.5822 - classification_loss: 0.0810 187/500 [==========>...................] - ETA: 1:42 - loss: 0.6634 - regression_loss: 0.5824 - classification_loss: 0.0810 188/500 [==========>...................] - ETA: 1:42 - loss: 0.6633 - regression_loss: 0.5823 - classification_loss: 0.0811 189/500 [==========>...................] - ETA: 1:41 - loss: 0.6640 - regression_loss: 0.5828 - classification_loss: 0.0812 190/500 [==========>...................] - ETA: 1:41 - loss: 0.6667 - regression_loss: 0.5854 - classification_loss: 0.0813 191/500 [==========>...................] - ETA: 1:41 - loss: 0.6666 - regression_loss: 0.5855 - classification_loss: 0.0811 192/500 [==========>...................] - ETA: 1:40 - loss: 0.6663 - regression_loss: 0.5852 - classification_loss: 0.0810 193/500 [==========>...................] - ETA: 1:40 - loss: 0.6647 - regression_loss: 0.5840 - classification_loss: 0.0807 194/500 [==========>...................] - ETA: 1:40 - loss: 0.6661 - regression_loss: 0.5852 - classification_loss: 0.0809 195/500 [==========>...................] - ETA: 1:39 - loss: 0.6661 - regression_loss: 0.5851 - classification_loss: 0.0809 196/500 [==========>...................] 
- ETA: 1:39 - loss: 0.6649 - regression_loss: 0.5841 - classification_loss: 0.0808 197/500 [==========>...................] - ETA: 1:39 - loss: 0.6647 - regression_loss: 0.5840 - classification_loss: 0.0807 198/500 [==========>...................] - ETA: 1:39 - loss: 0.6651 - regression_loss: 0.5843 - classification_loss: 0.0808 199/500 [==========>...................] - ETA: 1:38 - loss: 0.6643 - regression_loss: 0.5836 - classification_loss: 0.0807 200/500 [===========>..................] - ETA: 1:38 - loss: 0.6634 - regression_loss: 0.5828 - classification_loss: 0.0805 201/500 [===========>..................] - ETA: 1:38 - loss: 0.6642 - regression_loss: 0.5836 - classification_loss: 0.0806 202/500 [===========>..................] - ETA: 1:37 - loss: 0.6637 - regression_loss: 0.5830 - classification_loss: 0.0807 203/500 [===========>..................] - ETA: 1:37 - loss: 0.6629 - regression_loss: 0.5823 - classification_loss: 0.0806 204/500 [===========>..................] - ETA: 1:37 - loss: 0.6624 - regression_loss: 0.5818 - classification_loss: 0.0805 205/500 [===========>..................] - ETA: 1:36 - loss: 0.6612 - regression_loss: 0.5808 - classification_loss: 0.0804 206/500 [===========>..................] - ETA: 1:36 - loss: 0.6597 - regression_loss: 0.5795 - classification_loss: 0.0802 207/500 [===========>..................] - ETA: 1:36 - loss: 0.6592 - regression_loss: 0.5790 - classification_loss: 0.0802 208/500 [===========>..................] - ETA: 1:35 - loss: 0.6578 - regression_loss: 0.5779 - classification_loss: 0.0799 209/500 [===========>..................] - ETA: 1:35 - loss: 0.6555 - regression_loss: 0.5759 - classification_loss: 0.0797 210/500 [===========>..................] - ETA: 1:35 - loss: 0.6534 - regression_loss: 0.5740 - classification_loss: 0.0794 211/500 [===========>..................] - ETA: 1:34 - loss: 0.6518 - regression_loss: 0.5726 - classification_loss: 0.0792 212/500 [===========>..................] 
- ETA: 1:34 - loss: 0.6526 - regression_loss: 0.5733 - classification_loss: 0.0793 213/500 [===========>..................] - ETA: 1:34 - loss: 0.6534 - regression_loss: 0.5740 - classification_loss: 0.0794 214/500 [===========>..................] - ETA: 1:33 - loss: 0.6543 - regression_loss: 0.5748 - classification_loss: 0.0795 215/500 [===========>..................] - ETA: 1:33 - loss: 0.6547 - regression_loss: 0.5752 - classification_loss: 0.0795 216/500 [===========>..................] - ETA: 1:33 - loss: 0.6572 - regression_loss: 0.5773 - classification_loss: 0.0800 217/500 [============>.................] - ETA: 1:32 - loss: 0.6552 - regression_loss: 0.5755 - classification_loss: 0.0797 218/500 [============>.................] - ETA: 1:32 - loss: 0.6564 - regression_loss: 0.5764 - classification_loss: 0.0801 219/500 [============>.................] - ETA: 1:32 - loss: 0.6565 - regression_loss: 0.5764 - classification_loss: 0.0801 220/500 [============>.................] - ETA: 1:31 - loss: 0.6548 - regression_loss: 0.5749 - classification_loss: 0.0799 221/500 [============>.................] - ETA: 1:31 - loss: 0.6532 - regression_loss: 0.5735 - classification_loss: 0.0797 222/500 [============>.................] - ETA: 1:31 - loss: 0.6530 - regression_loss: 0.5732 - classification_loss: 0.0798 223/500 [============>.................] - ETA: 1:30 - loss: 0.6511 - regression_loss: 0.5715 - classification_loss: 0.0795 224/500 [============>.................] - ETA: 1:30 - loss: 0.6494 - regression_loss: 0.5701 - classification_loss: 0.0793 225/500 [============>.................] - ETA: 1:30 - loss: 0.6493 - regression_loss: 0.5698 - classification_loss: 0.0794 226/500 [============>.................] - ETA: 1:29 - loss: 0.6515 - regression_loss: 0.5715 - classification_loss: 0.0800 227/500 [============>.................] - ETA: 1:29 - loss: 0.6507 - regression_loss: 0.5708 - classification_loss: 0.0799 228/500 [============>.................] 
- ETA: 1:29 - loss: 0.6509 - regression_loss: 0.5712 - classification_loss: 0.0797 229/500 [============>.................] - ETA: 1:29 - loss: 0.6534 - regression_loss: 0.5729 - classification_loss: 0.0805 230/500 [============>.................] - ETA: 1:28 - loss: 0.6521 - regression_loss: 0.5718 - classification_loss: 0.0803 231/500 [============>.................] - ETA: 1:28 - loss: 0.6519 - regression_loss: 0.5715 - classification_loss: 0.0804 232/500 [============>.................] - ETA: 1:28 - loss: 0.6511 - regression_loss: 0.5708 - classification_loss: 0.0803 233/500 [============>.................] - ETA: 1:27 - loss: 0.6507 - regression_loss: 0.5705 - classification_loss: 0.0802 234/500 [=============>................] - ETA: 1:27 - loss: 0.6492 - regression_loss: 0.5692 - classification_loss: 0.0801 235/500 [=============>................] - ETA: 1:27 - loss: 0.6507 - regression_loss: 0.5703 - classification_loss: 0.0804 236/500 [=============>................] - ETA: 1:26 - loss: 0.6491 - regression_loss: 0.5689 - classification_loss: 0.0802 237/500 [=============>................] - ETA: 1:26 - loss: 0.6500 - regression_loss: 0.5696 - classification_loss: 0.0805 238/500 [=============>................] - ETA: 1:26 - loss: 0.6493 - regression_loss: 0.5689 - classification_loss: 0.0804 239/500 [=============>................] - ETA: 1:25 - loss: 0.6491 - regression_loss: 0.5689 - classification_loss: 0.0803 240/500 [=============>................] - ETA: 1:25 - loss: 0.6491 - regression_loss: 0.5687 - classification_loss: 0.0803 241/500 [=============>................] - ETA: 1:25 - loss: 0.6499 - regression_loss: 0.5695 - classification_loss: 0.0804 242/500 [=============>................] - ETA: 1:24 - loss: 0.6491 - regression_loss: 0.5688 - classification_loss: 0.0803 243/500 [=============>................] - ETA: 1:24 - loss: 0.6504 - regression_loss: 0.5700 - classification_loss: 0.0804 244/500 [=============>................] 
- ETA: 1:24 - loss: 0.6533 - regression_loss: 0.5724 - classification_loss: 0.0809 245/500 [=============>................] - ETA: 1:23 - loss: 0.6534 - regression_loss: 0.5726 - classification_loss: 0.0808 246/500 [=============>................] - ETA: 1:23 - loss: 0.6520 - regression_loss: 0.5714 - classification_loss: 0.0807 247/500 [=============>................] - ETA: 1:23 - loss: 0.6509 - regression_loss: 0.5704 - classification_loss: 0.0806 248/500 [=============>................] - ETA: 1:22 - loss: 0.6518 - regression_loss: 0.5710 - classification_loss: 0.0808 249/500 [=============>................] - ETA: 1:22 - loss: 0.6516 - regression_loss: 0.5709 - classification_loss: 0.0807 250/500 [==============>...............] - ETA: 1:22 - loss: 0.6515 - regression_loss: 0.5708 - classification_loss: 0.0808 251/500 [==============>...............] - ETA: 1:21 - loss: 0.6513 - regression_loss: 0.5706 - classification_loss: 0.0807 252/500 [==============>...............] - ETA: 1:21 - loss: 0.6517 - regression_loss: 0.5708 - classification_loss: 0.0808 253/500 [==============>...............] - ETA: 1:21 - loss: 0.6523 - regression_loss: 0.5714 - classification_loss: 0.0809 254/500 [==============>...............] - ETA: 1:20 - loss: 0.6523 - regression_loss: 0.5713 - classification_loss: 0.0810 255/500 [==============>...............] - ETA: 1:20 - loss: 0.6539 - regression_loss: 0.5730 - classification_loss: 0.0809 256/500 [==============>...............] - ETA: 1:20 - loss: 0.6527 - regression_loss: 0.5720 - classification_loss: 0.0807 257/500 [==============>...............] - ETA: 1:19 - loss: 0.6512 - regression_loss: 0.5707 - classification_loss: 0.0805 258/500 [==============>...............] - ETA: 1:19 - loss: 0.6511 - regression_loss: 0.5707 - classification_loss: 0.0804 259/500 [==============>...............] - ETA: 1:19 - loss: 0.6519 - regression_loss: 0.5714 - classification_loss: 0.0805 260/500 [==============>...............] 
- ETA: 1:18 - loss: 0.6512 - regression_loss: 0.5708 - classification_loss: 0.0804 261/500 [==============>...............] - ETA: 1:18 - loss: 0.6507 - regression_loss: 0.5703 - classification_loss: 0.0804 262/500 [==============>...............] - ETA: 1:18 - loss: 0.6521 - regression_loss: 0.5715 - classification_loss: 0.0806 263/500 [==============>...............] - ETA: 1:17 - loss: 0.6510 - regression_loss: 0.5705 - classification_loss: 0.0805 264/500 [==============>...............] - ETA: 1:17 - loss: 0.6522 - regression_loss: 0.5716 - classification_loss: 0.0805 265/500 [==============>...............] - ETA: 1:17 - loss: 0.6518 - regression_loss: 0.5714 - classification_loss: 0.0804 266/500 [==============>...............] - ETA: 1:16 - loss: 0.6507 - regression_loss: 0.5704 - classification_loss: 0.0803 267/500 [===============>..............] - ETA: 1:16 - loss: 0.6495 - regression_loss: 0.5695 - classification_loss: 0.0801 268/500 [===============>..............] - ETA: 1:16 - loss: 0.6481 - regression_loss: 0.5682 - classification_loss: 0.0799 269/500 [===============>..............] - ETA: 1:15 - loss: 0.6484 - regression_loss: 0.5684 - classification_loss: 0.0800 270/500 [===============>..............] - ETA: 1:15 - loss: 0.6487 - regression_loss: 0.5687 - classification_loss: 0.0800 271/500 [===============>..............] - ETA: 1:15 - loss: 0.6497 - regression_loss: 0.5693 - classification_loss: 0.0803 272/500 [===============>..............] - ETA: 1:14 - loss: 0.6529 - regression_loss: 0.5719 - classification_loss: 0.0810 273/500 [===============>..............] - ETA: 1:14 - loss: 0.6520 - regression_loss: 0.5712 - classification_loss: 0.0808 274/500 [===============>..............] - ETA: 1:14 - loss: 0.6526 - regression_loss: 0.5718 - classification_loss: 0.0808 275/500 [===============>..............] - ETA: 1:13 - loss: 0.6519 - regression_loss: 0.5712 - classification_loss: 0.0806 276/500 [===============>..............] 
- ETA: 1:13 - loss: 0.6520 - regression_loss: 0.5714 - classification_loss: 0.0806 277/500 [===============>..............] - ETA: 1:13 - loss: 0.6520 - regression_loss: 0.5715 - classification_loss: 0.0805 278/500 [===============>..............] - ETA: 1:12 - loss: 0.6524 - regression_loss: 0.5718 - classification_loss: 0.0806 279/500 [===============>..............] - ETA: 1:12 - loss: 0.6534 - regression_loss: 0.5728 - classification_loss: 0.0807 280/500 [===============>..............] - ETA: 1:12 - loss: 0.6533 - regression_loss: 0.5727 - classification_loss: 0.0807 281/500 [===============>..............] - ETA: 1:11 - loss: 0.6539 - regression_loss: 0.5732 - classification_loss: 0.0808 282/500 [===============>..............] - ETA: 1:11 - loss: 0.6524 - regression_loss: 0.5719 - classification_loss: 0.0806 283/500 [===============>..............] - ETA: 1:11 - loss: 0.6526 - regression_loss: 0.5720 - classification_loss: 0.0806 284/500 [================>.............] - ETA: 1:10 - loss: 0.6534 - regression_loss: 0.5728 - classification_loss: 0.0806 285/500 [================>.............] - ETA: 1:10 - loss: 0.6534 - regression_loss: 0.5728 - classification_loss: 0.0806 286/500 [================>.............] - ETA: 1:10 - loss: 0.6538 - regression_loss: 0.5732 - classification_loss: 0.0806 287/500 [================>.............] - ETA: 1:10 - loss: 0.6536 - regression_loss: 0.5728 - classification_loss: 0.0808 288/500 [================>.............] - ETA: 1:09 - loss: 0.6535 - regression_loss: 0.5727 - classification_loss: 0.0808 289/500 [================>.............] - ETA: 1:09 - loss: 0.6545 - regression_loss: 0.5734 - classification_loss: 0.0811 290/500 [================>.............] - ETA: 1:09 - loss: 0.6543 - regression_loss: 0.5733 - classification_loss: 0.0810 291/500 [================>.............] - ETA: 1:08 - loss: 0.6545 - regression_loss: 0.5733 - classification_loss: 0.0811 292/500 [================>.............] 
- ETA: 1:08 - loss: 0.6536 - regression_loss: 0.5725 - classification_loss: 0.0811 293/500 [================>.............] - ETA: 1:08 - loss: 0.6541 - regression_loss: 0.5729 - classification_loss: 0.0812 294/500 [================>.............] - ETA: 1:07 - loss: 0.6548 - regression_loss: 0.5735 - classification_loss: 0.0813 295/500 [================>.............] - ETA: 1:07 - loss: 0.6546 - regression_loss: 0.5733 - classification_loss: 0.0812 296/500 [================>.............] - ETA: 1:07 - loss: 0.6537 - regression_loss: 0.5726 - classification_loss: 0.0811 297/500 [================>.............] - ETA: 1:06 - loss: 0.6550 - regression_loss: 0.5737 - classification_loss: 0.0813 298/500 [================>.............] - ETA: 1:06 - loss: 0.6568 - regression_loss: 0.5752 - classification_loss: 0.0815 299/500 [================>.............] - ETA: 1:06 - loss: 0.6563 - regression_loss: 0.5749 - classification_loss: 0.0814 300/500 [=================>............] - ETA: 1:05 - loss: 0.6555 - regression_loss: 0.5742 - classification_loss: 0.0813 301/500 [=================>............] - ETA: 1:05 - loss: 0.6555 - regression_loss: 0.5742 - classification_loss: 0.0812 302/500 [=================>............] - ETA: 1:05 - loss: 0.6541 - regression_loss: 0.5731 - classification_loss: 0.0810 303/500 [=================>............] - ETA: 1:04 - loss: 0.6565 - regression_loss: 0.5750 - classification_loss: 0.0815 304/500 [=================>............] - ETA: 1:04 - loss: 0.6565 - regression_loss: 0.5749 - classification_loss: 0.0816 305/500 [=================>............] - ETA: 1:04 - loss: 0.6564 - regression_loss: 0.5749 - classification_loss: 0.0815 306/500 [=================>............] - ETA: 1:03 - loss: 0.6561 - regression_loss: 0.5745 - classification_loss: 0.0816 307/500 [=================>............] - ETA: 1:03 - loss: 0.6560 - regression_loss: 0.5745 - classification_loss: 0.0815 308/500 [=================>............] 
[Epoch 48: per-batch progress lines for batches 309-499 elided; loss held steady near 0.65 (regression_loss ~0.57, classification_loss ~0.081).]
500/500 [==============================] - 165s 330ms/step - loss: 0.6498 - regression_loss: 0.5690 - classification_loss: 0.0808
1172 instances of class plum with average precision: 0.6302
mAP: 0.6302
Epoch 00048: saving model to ./training/snapshots/resnet101_pascal_48.h5
Epoch 49/150
[Epoch 49: per-batch progress lines for batches 1-13 elided; loss ~0.61-0.74.]
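As a sanity check on these logs: RetinaNet's total `loss` is the sum of the box-regression term and the classification (focal) term, so the two component columns should add up to the total on every line. A minimal sketch (assuming nothing beyond the logged numbers themselves; the regex and variable names are illustrative, not part of keras-retinanet) that parses one epoch-end summary line and verifies the decomposition:

```python
import re

# One epoch-end summary line as it appears in the log above.
line = ("500/500 [==============================] - 165s 330ms/step - "
        "loss: 0.6498 - regression_loss: 0.5690 - classification_loss: 0.0808")

# Pull each named metric out of the Keras progress-bar line.
metrics = {name: float(val)
           for name, val in re.findall(r"(\w+): (\d+\.\d+)", line)}

# Total loss should equal regression + classification, up to the
# 4-decimal rounding the progress bar applies to each value.
total = metrics["regression_loss"] + metrics["classification_loss"]
assert abs(total - metrics["loss"]) < 5e-4
```

The same parse can be run over the whole log to plot the loss trend instead of eyeballing the progress-bar lines.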
[Epoch 49: per-batch progress lines for batches 14-142 elided; loss 0.60-0.71, settling near 0.70 (classification_loss ~0.084).]
- ETA: 1:58 - loss: 0.6954 - regression_loss: 0.6116 - classification_loss: 0.0838 143/500 [=======>......................] - ETA: 1:58 - loss: 0.6932 - regression_loss: 0.6094 - classification_loss: 0.0837 144/500 [=======>......................] - ETA: 1:58 - loss: 0.6950 - regression_loss: 0.6111 - classification_loss: 0.0839 145/500 [=======>......................] - ETA: 1:57 - loss: 0.6946 - regression_loss: 0.6107 - classification_loss: 0.0839 146/500 [=======>......................] - ETA: 1:57 - loss: 0.6947 - regression_loss: 0.6108 - classification_loss: 0.0839 147/500 [=======>......................] - ETA: 1:57 - loss: 0.6926 - regression_loss: 0.6089 - classification_loss: 0.0837 148/500 [=======>......................] - ETA: 1:56 - loss: 0.6904 - regression_loss: 0.6069 - classification_loss: 0.0834 149/500 [=======>......................] - ETA: 1:56 - loss: 0.6904 - regression_loss: 0.6070 - classification_loss: 0.0834 150/500 [========>.....................] - ETA: 1:56 - loss: 0.6950 - regression_loss: 0.6107 - classification_loss: 0.0842 151/500 [========>.....................] - ETA: 1:55 - loss: 0.6946 - regression_loss: 0.6107 - classification_loss: 0.0839 152/500 [========>.....................] - ETA: 1:55 - loss: 0.6950 - regression_loss: 0.6113 - classification_loss: 0.0838 153/500 [========>.....................] - ETA: 1:55 - loss: 0.6934 - regression_loss: 0.6099 - classification_loss: 0.0835 154/500 [========>.....................] - ETA: 1:54 - loss: 0.6933 - regression_loss: 0.6099 - classification_loss: 0.0835 155/500 [========>.....................] - ETA: 1:54 - loss: 0.6975 - regression_loss: 0.6130 - classification_loss: 0.0845 156/500 [========>.....................] - ETA: 1:54 - loss: 0.6986 - regression_loss: 0.6135 - classification_loss: 0.0851 157/500 [========>.....................] - ETA: 1:53 - loss: 0.6993 - regression_loss: 0.6143 - classification_loss: 0.0850 158/500 [========>.....................] 
- ETA: 1:53 - loss: 0.6979 - regression_loss: 0.6132 - classification_loss: 0.0846 159/500 [========>.....................] - ETA: 1:53 - loss: 0.6978 - regression_loss: 0.6134 - classification_loss: 0.0845 160/500 [========>.....................] - ETA: 1:52 - loss: 0.6974 - regression_loss: 0.6132 - classification_loss: 0.0843 161/500 [========>.....................] - ETA: 1:52 - loss: 0.6995 - regression_loss: 0.6150 - classification_loss: 0.0845 162/500 [========>.....................] - ETA: 1:52 - loss: 0.6989 - regression_loss: 0.6143 - classification_loss: 0.0845 163/500 [========>.....................] - ETA: 1:51 - loss: 0.6982 - regression_loss: 0.6140 - classification_loss: 0.0843 164/500 [========>.....................] - ETA: 1:51 - loss: 0.6976 - regression_loss: 0.6134 - classification_loss: 0.0842 165/500 [========>.....................] - ETA: 1:51 - loss: 0.6992 - regression_loss: 0.6149 - classification_loss: 0.0844 166/500 [========>.....................] - ETA: 1:50 - loss: 0.6987 - regression_loss: 0.6145 - classification_loss: 0.0841 167/500 [=========>....................] - ETA: 1:50 - loss: 0.6977 - regression_loss: 0.6135 - classification_loss: 0.0842 168/500 [=========>....................] - ETA: 1:50 - loss: 0.7010 - regression_loss: 0.6164 - classification_loss: 0.0846 169/500 [=========>....................] - ETA: 1:49 - loss: 0.6996 - regression_loss: 0.6152 - classification_loss: 0.0844 170/500 [=========>....................] - ETA: 1:49 - loss: 0.6996 - regression_loss: 0.6150 - classification_loss: 0.0845 171/500 [=========>....................] - ETA: 1:49 - loss: 0.6983 - regression_loss: 0.6140 - classification_loss: 0.0843 172/500 [=========>....................] - ETA: 1:48 - loss: 0.6955 - regression_loss: 0.6116 - classification_loss: 0.0839 173/500 [=========>....................] - ETA: 1:48 - loss: 0.6958 - regression_loss: 0.6118 - classification_loss: 0.0841 174/500 [=========>....................] 
- ETA: 1:48 - loss: 0.6962 - regression_loss: 0.6121 - classification_loss: 0.0841 175/500 [=========>....................] - ETA: 1:47 - loss: 0.6972 - regression_loss: 0.6129 - classification_loss: 0.0843 176/500 [=========>....................] - ETA: 1:47 - loss: 0.6974 - regression_loss: 0.6133 - classification_loss: 0.0841 177/500 [=========>....................] - ETA: 1:47 - loss: 0.6980 - regression_loss: 0.6137 - classification_loss: 0.0843 178/500 [=========>....................] - ETA: 1:46 - loss: 0.6992 - regression_loss: 0.6147 - classification_loss: 0.0845 179/500 [=========>....................] - ETA: 1:46 - loss: 0.6992 - regression_loss: 0.6148 - classification_loss: 0.0843 180/500 [=========>....................] - ETA: 1:46 - loss: 0.6989 - regression_loss: 0.6148 - classification_loss: 0.0841 181/500 [=========>....................] - ETA: 1:45 - loss: 0.6988 - regression_loss: 0.6148 - classification_loss: 0.0840 182/500 [=========>....................] - ETA: 1:45 - loss: 0.6992 - regression_loss: 0.6153 - classification_loss: 0.0840 183/500 [=========>....................] - ETA: 1:45 - loss: 0.7005 - regression_loss: 0.6165 - classification_loss: 0.0840 184/500 [==========>...................] - ETA: 1:44 - loss: 0.7021 - regression_loss: 0.6178 - classification_loss: 0.0843 185/500 [==========>...................] - ETA: 1:44 - loss: 0.7041 - regression_loss: 0.6194 - classification_loss: 0.0847 186/500 [==========>...................] - ETA: 1:44 - loss: 0.7035 - regression_loss: 0.6187 - classification_loss: 0.0848 187/500 [==========>...................] - ETA: 1:43 - loss: 0.7061 - regression_loss: 0.6208 - classification_loss: 0.0853 188/500 [==========>...................] - ETA: 1:43 - loss: 0.7060 - regression_loss: 0.6208 - classification_loss: 0.0852 189/500 [==========>...................] - ETA: 1:43 - loss: 0.7069 - regression_loss: 0.6216 - classification_loss: 0.0853 190/500 [==========>...................] 
- ETA: 1:42 - loss: 0.7069 - regression_loss: 0.6217 - classification_loss: 0.0852 191/500 [==========>...................] - ETA: 1:42 - loss: 0.7071 - regression_loss: 0.6221 - classification_loss: 0.0850 192/500 [==========>...................] - ETA: 1:41 - loss: 0.7068 - regression_loss: 0.6221 - classification_loss: 0.0847 193/500 [==========>...................] - ETA: 1:41 - loss: 0.7087 - regression_loss: 0.6234 - classification_loss: 0.0853 194/500 [==========>...................] - ETA: 1:41 - loss: 0.7101 - regression_loss: 0.6250 - classification_loss: 0.0851 195/500 [==========>...................] - ETA: 1:40 - loss: 0.7125 - regression_loss: 0.6272 - classification_loss: 0.0853 196/500 [==========>...................] - ETA: 1:40 - loss: 0.7112 - regression_loss: 0.6261 - classification_loss: 0.0851 197/500 [==========>...................] - ETA: 1:40 - loss: 0.7112 - regression_loss: 0.6260 - classification_loss: 0.0852 198/500 [==========>...................] - ETA: 1:39 - loss: 0.7129 - regression_loss: 0.6273 - classification_loss: 0.0856 199/500 [==========>...................] - ETA: 1:39 - loss: 0.7114 - regression_loss: 0.6260 - classification_loss: 0.0854 200/500 [===========>..................] - ETA: 1:39 - loss: 0.7111 - regression_loss: 0.6257 - classification_loss: 0.0854 201/500 [===========>..................] - ETA: 1:38 - loss: 0.7110 - regression_loss: 0.6256 - classification_loss: 0.0854 202/500 [===========>..................] - ETA: 1:38 - loss: 0.7092 - regression_loss: 0.6240 - classification_loss: 0.0852 203/500 [===========>..................] - ETA: 1:38 - loss: 0.7102 - regression_loss: 0.6249 - classification_loss: 0.0853 204/500 [===========>..................] - ETA: 1:37 - loss: 0.7082 - regression_loss: 0.6231 - classification_loss: 0.0850 205/500 [===========>..................] - ETA: 1:37 - loss: 0.7098 - regression_loss: 0.6245 - classification_loss: 0.0853 206/500 [===========>..................] 
- ETA: 1:37 - loss: 0.7072 - regression_loss: 0.6222 - classification_loss: 0.0850 207/500 [===========>..................] - ETA: 1:36 - loss: 0.7067 - regression_loss: 0.6218 - classification_loss: 0.0849 208/500 [===========>..................] - ETA: 1:36 - loss: 0.7048 - regression_loss: 0.6201 - classification_loss: 0.0847 209/500 [===========>..................] - ETA: 1:36 - loss: 0.7060 - regression_loss: 0.6207 - classification_loss: 0.0852 210/500 [===========>..................] - ETA: 1:35 - loss: 0.7047 - regression_loss: 0.6197 - classification_loss: 0.0850 211/500 [===========>..................] - ETA: 1:35 - loss: 0.7059 - regression_loss: 0.6207 - classification_loss: 0.0853 212/500 [===========>..................] - ETA: 1:35 - loss: 0.7072 - regression_loss: 0.6216 - classification_loss: 0.0856 213/500 [===========>..................] - ETA: 1:34 - loss: 0.7079 - regression_loss: 0.6221 - classification_loss: 0.0858 214/500 [===========>..................] - ETA: 1:34 - loss: 0.7054 - regression_loss: 0.6199 - classification_loss: 0.0855 215/500 [===========>..................] - ETA: 1:34 - loss: 0.7068 - regression_loss: 0.6211 - classification_loss: 0.0857 216/500 [===========>..................] - ETA: 1:33 - loss: 0.7068 - regression_loss: 0.6212 - classification_loss: 0.0856 217/500 [============>.................] - ETA: 1:33 - loss: 0.7058 - regression_loss: 0.6204 - classification_loss: 0.0854 218/500 [============>.................] - ETA: 1:33 - loss: 0.7053 - regression_loss: 0.6198 - classification_loss: 0.0855 219/500 [============>.................] - ETA: 1:32 - loss: 0.7035 - regression_loss: 0.6183 - classification_loss: 0.0852 220/500 [============>.................] - ETA: 1:32 - loss: 0.7020 - regression_loss: 0.6171 - classification_loss: 0.0849 221/500 [============>.................] - ETA: 1:32 - loss: 0.7029 - regression_loss: 0.6177 - classification_loss: 0.0852 222/500 [============>.................] 
- ETA: 1:31 - loss: 0.7025 - regression_loss: 0.6174 - classification_loss: 0.0851 223/500 [============>.................] - ETA: 1:31 - loss: 0.7006 - regression_loss: 0.6158 - classification_loss: 0.0849 224/500 [============>.................] - ETA: 1:31 - loss: 0.7013 - regression_loss: 0.6165 - classification_loss: 0.0848 225/500 [============>.................] - ETA: 1:30 - loss: 0.7038 - regression_loss: 0.6186 - classification_loss: 0.0852 226/500 [============>.................] - ETA: 1:30 - loss: 0.7048 - regression_loss: 0.6196 - classification_loss: 0.0852 227/500 [============>.................] - ETA: 1:30 - loss: 0.7030 - regression_loss: 0.6179 - classification_loss: 0.0851 228/500 [============>.................] - ETA: 1:29 - loss: 0.7024 - regression_loss: 0.6173 - classification_loss: 0.0850 229/500 [============>.................] - ETA: 1:29 - loss: 0.7018 - regression_loss: 0.6169 - classification_loss: 0.0848 230/500 [============>.................] - ETA: 1:29 - loss: 0.7014 - regression_loss: 0.6166 - classification_loss: 0.0849 231/500 [============>.................] - ETA: 1:28 - loss: 0.7000 - regression_loss: 0.6153 - classification_loss: 0.0847 232/500 [============>.................] - ETA: 1:28 - loss: 0.7012 - regression_loss: 0.6162 - classification_loss: 0.0850 233/500 [============>.................] - ETA: 1:28 - loss: 0.7012 - regression_loss: 0.6162 - classification_loss: 0.0850 234/500 [=============>................] - ETA: 1:27 - loss: 0.7014 - regression_loss: 0.6165 - classification_loss: 0.0849 235/500 [=============>................] - ETA: 1:27 - loss: 0.7008 - regression_loss: 0.6160 - classification_loss: 0.0848 236/500 [=============>................] - ETA: 1:27 - loss: 0.7006 - regression_loss: 0.6158 - classification_loss: 0.0849 237/500 [=============>................] - ETA: 1:26 - loss: 0.7011 - regression_loss: 0.6163 - classification_loss: 0.0848 238/500 [=============>................] 
- ETA: 1:26 - loss: 0.7008 - regression_loss: 0.6160 - classification_loss: 0.0848 239/500 [=============>................] - ETA: 1:26 - loss: 0.7000 - regression_loss: 0.6153 - classification_loss: 0.0847 240/500 [=============>................] - ETA: 1:25 - loss: 0.6987 - regression_loss: 0.6142 - classification_loss: 0.0845 241/500 [=============>................] - ETA: 1:25 - loss: 0.6989 - regression_loss: 0.6143 - classification_loss: 0.0845 242/500 [=============>................] - ETA: 1:25 - loss: 0.6971 - regression_loss: 0.6127 - classification_loss: 0.0843 243/500 [=============>................] - ETA: 1:24 - loss: 0.6976 - regression_loss: 0.6133 - classification_loss: 0.0844 244/500 [=============>................] - ETA: 1:24 - loss: 0.6961 - regression_loss: 0.6118 - classification_loss: 0.0842 245/500 [=============>................] - ETA: 1:24 - loss: 0.6947 - regression_loss: 0.6105 - classification_loss: 0.0842 246/500 [=============>................] - ETA: 1:23 - loss: 0.6945 - regression_loss: 0.6103 - classification_loss: 0.0842 247/500 [=============>................] - ETA: 1:23 - loss: 0.6940 - regression_loss: 0.6098 - classification_loss: 0.0842 248/500 [=============>................] - ETA: 1:23 - loss: 0.6924 - regression_loss: 0.6084 - classification_loss: 0.0840 249/500 [=============>................] - ETA: 1:22 - loss: 0.6915 - regression_loss: 0.6076 - classification_loss: 0.0839 250/500 [==============>...............] - ETA: 1:22 - loss: 0.6912 - regression_loss: 0.6073 - classification_loss: 0.0839 251/500 [==============>...............] - ETA: 1:22 - loss: 0.6912 - regression_loss: 0.6073 - classification_loss: 0.0839 252/500 [==============>...............] - ETA: 1:21 - loss: 0.6908 - regression_loss: 0.6070 - classification_loss: 0.0838 253/500 [==============>...............] - ETA: 1:21 - loss: 0.6888 - regression_loss: 0.6053 - classification_loss: 0.0836 254/500 [==============>...............] 
- ETA: 1:21 - loss: 0.6891 - regression_loss: 0.6054 - classification_loss: 0.0837 255/500 [==============>...............] - ETA: 1:20 - loss: 0.6900 - regression_loss: 0.6062 - classification_loss: 0.0838 256/500 [==============>...............] - ETA: 1:20 - loss: 0.6894 - regression_loss: 0.6056 - classification_loss: 0.0837 257/500 [==============>...............] - ETA: 1:20 - loss: 0.6882 - regression_loss: 0.6047 - classification_loss: 0.0835 258/500 [==============>...............] - ETA: 1:19 - loss: 0.6869 - regression_loss: 0.6036 - classification_loss: 0.0834 259/500 [==============>...............] - ETA: 1:19 - loss: 0.6877 - regression_loss: 0.6041 - classification_loss: 0.0835 260/500 [==============>...............] - ETA: 1:19 - loss: 0.6888 - regression_loss: 0.6049 - classification_loss: 0.0838 261/500 [==============>...............] - ETA: 1:18 - loss: 0.6882 - regression_loss: 0.6046 - classification_loss: 0.0836 262/500 [==============>...............] - ETA: 1:18 - loss: 0.6886 - regression_loss: 0.6048 - classification_loss: 0.0837 263/500 [==============>...............] - ETA: 1:18 - loss: 0.6882 - regression_loss: 0.6045 - classification_loss: 0.0837 264/500 [==============>...............] - ETA: 1:17 - loss: 0.6871 - regression_loss: 0.6035 - classification_loss: 0.0836 265/500 [==============>...............] - ETA: 1:17 - loss: 0.6873 - regression_loss: 0.6037 - classification_loss: 0.0836 266/500 [==============>...............] - ETA: 1:17 - loss: 0.6864 - regression_loss: 0.6029 - classification_loss: 0.0835 267/500 [===============>..............] - ETA: 1:16 - loss: 0.6863 - regression_loss: 0.6028 - classification_loss: 0.0835 268/500 [===============>..............] - ETA: 1:16 - loss: 0.6866 - regression_loss: 0.6031 - classification_loss: 0.0835 269/500 [===============>..............] - ETA: 1:16 - loss: 0.6851 - regression_loss: 0.6019 - classification_loss: 0.0833 270/500 [===============>..............] 
- ETA: 1:15 - loss: 0.6846 - regression_loss: 0.6014 - classification_loss: 0.0832 271/500 [===============>..............] - ETA: 1:15 - loss: 0.6850 - regression_loss: 0.6016 - classification_loss: 0.0834 272/500 [===============>..............] - ETA: 1:15 - loss: 0.6840 - regression_loss: 0.6006 - classification_loss: 0.0834 273/500 [===============>..............] - ETA: 1:14 - loss: 0.6844 - regression_loss: 0.6010 - classification_loss: 0.0835 274/500 [===============>..............] - ETA: 1:14 - loss: 0.6838 - regression_loss: 0.6005 - classification_loss: 0.0833 275/500 [===============>..............] - ETA: 1:14 - loss: 0.6844 - regression_loss: 0.6010 - classification_loss: 0.0834 276/500 [===============>..............] - ETA: 1:13 - loss: 0.6847 - regression_loss: 0.6012 - classification_loss: 0.0835 277/500 [===============>..............] - ETA: 1:13 - loss: 0.6835 - regression_loss: 0.6002 - classification_loss: 0.0833 278/500 [===============>..............] - ETA: 1:13 - loss: 0.6833 - regression_loss: 0.6001 - classification_loss: 0.0833 279/500 [===============>..............] - ETA: 1:12 - loss: 0.6827 - regression_loss: 0.5996 - classification_loss: 0.0831 280/500 [===============>..............] - ETA: 1:12 - loss: 0.6831 - regression_loss: 0.5999 - classification_loss: 0.0831 281/500 [===============>..............] - ETA: 1:12 - loss: 0.6824 - regression_loss: 0.5992 - classification_loss: 0.0831 282/500 [===============>..............] - ETA: 1:11 - loss: 0.6830 - regression_loss: 0.5999 - classification_loss: 0.0831 283/500 [===============>..............] - ETA: 1:11 - loss: 0.6837 - regression_loss: 0.6005 - classification_loss: 0.0832 284/500 [================>.............] - ETA: 1:11 - loss: 0.6841 - regression_loss: 0.6009 - classification_loss: 0.0832 285/500 [================>.............] - ETA: 1:10 - loss: 0.6831 - regression_loss: 0.6000 - classification_loss: 0.0831 286/500 [================>.............] 
- ETA: 1:10 - loss: 0.6819 - regression_loss: 0.5989 - classification_loss: 0.0829 287/500 [================>.............] - ETA: 1:10 - loss: 0.6812 - regression_loss: 0.5985 - classification_loss: 0.0828 288/500 [================>.............] - ETA: 1:09 - loss: 0.6821 - regression_loss: 0.5992 - classification_loss: 0.0829 289/500 [================>.............] - ETA: 1:09 - loss: 0.6828 - regression_loss: 0.5997 - classification_loss: 0.0832 290/500 [================>.............] - ETA: 1:09 - loss: 0.6829 - regression_loss: 0.5996 - classification_loss: 0.0832 291/500 [================>.............] - ETA: 1:08 - loss: 0.6834 - regression_loss: 0.6002 - classification_loss: 0.0832 292/500 [================>.............] - ETA: 1:08 - loss: 0.6850 - regression_loss: 0.6014 - classification_loss: 0.0836 293/500 [================>.............] - ETA: 1:08 - loss: 0.6836 - regression_loss: 0.6002 - classification_loss: 0.0834 294/500 [================>.............] - ETA: 1:07 - loss: 0.6826 - regression_loss: 0.5994 - classification_loss: 0.0832 295/500 [================>.............] - ETA: 1:07 - loss: 0.6826 - regression_loss: 0.5995 - classification_loss: 0.0831 296/500 [================>.............] - ETA: 1:07 - loss: 0.6813 - regression_loss: 0.5984 - classification_loss: 0.0829 297/500 [================>.............] - ETA: 1:06 - loss: 0.6806 - regression_loss: 0.5977 - classification_loss: 0.0828 298/500 [================>.............] - ETA: 1:06 - loss: 0.6804 - regression_loss: 0.5975 - classification_loss: 0.0829 299/500 [================>.............] - ETA: 1:06 - loss: 0.6803 - regression_loss: 0.5974 - classification_loss: 0.0829 300/500 [=================>............] - ETA: 1:05 - loss: 0.6792 - regression_loss: 0.5964 - classification_loss: 0.0828 301/500 [=================>............] - ETA: 1:05 - loss: 0.6787 - regression_loss: 0.5960 - classification_loss: 0.0827 302/500 [=================>............] 
- ETA: 1:05 - loss: 0.6776 - regression_loss: 0.5951 - classification_loss: 0.0825 303/500 [=================>............] - ETA: 1:05 - loss: 0.6777 - regression_loss: 0.5952 - classification_loss: 0.0825 304/500 [=================>............] - ETA: 1:04 - loss: 0.6787 - regression_loss: 0.5962 - classification_loss: 0.0825 305/500 [=================>............] - ETA: 1:04 - loss: 0.6785 - regression_loss: 0.5960 - classification_loss: 0.0825 306/500 [=================>............] - ETA: 1:04 - loss: 0.6795 - regression_loss: 0.5968 - classification_loss: 0.0827 307/500 [=================>............] - ETA: 1:03 - loss: 0.6791 - regression_loss: 0.5966 - classification_loss: 0.0825 308/500 [=================>............] - ETA: 1:03 - loss: 0.6795 - regression_loss: 0.5971 - classification_loss: 0.0824 309/500 [=================>............] - ETA: 1:03 - loss: 0.6804 - regression_loss: 0.5977 - classification_loss: 0.0827 310/500 [=================>............] - ETA: 1:02 - loss: 0.6811 - regression_loss: 0.5982 - classification_loss: 0.0829 311/500 [=================>............] - ETA: 1:02 - loss: 0.6812 - regression_loss: 0.5983 - classification_loss: 0.0830 312/500 [=================>............] - ETA: 1:02 - loss: 0.6809 - regression_loss: 0.5979 - classification_loss: 0.0831 313/500 [=================>............] - ETA: 1:01 - loss: 0.6823 - regression_loss: 0.5990 - classification_loss: 0.0833 314/500 [=================>............] - ETA: 1:01 - loss: 0.6829 - regression_loss: 0.5994 - classification_loss: 0.0834 315/500 [=================>............] - ETA: 1:01 - loss: 0.6819 - regression_loss: 0.5986 - classification_loss: 0.0833 316/500 [=================>............] - ETA: 1:00 - loss: 0.6819 - regression_loss: 0.5986 - classification_loss: 0.0834 317/500 [==================>...........] - ETA: 1:00 - loss: 0.6820 - regression_loss: 0.5987 - classification_loss: 0.0833 318/500 [==================>...........] 
- ETA: 1:00 - loss: 0.6815 - regression_loss: 0.5983 - classification_loss: 0.0832 319/500 [==================>...........] - ETA: 59s - loss: 0.6803 - regression_loss: 0.5973 - classification_loss: 0.0830  320/500 [==================>...........] - ETA: 59s - loss: 0.6790 - regression_loss: 0.5962 - classification_loss: 0.0828 321/500 [==================>...........] - ETA: 59s - loss: 0.6783 - regression_loss: 0.5956 - classification_loss: 0.0826 322/500 [==================>...........] - ETA: 58s - loss: 0.6794 - regression_loss: 0.5968 - classification_loss: 0.0827 323/500 [==================>...........] - ETA: 58s - loss: 0.6780 - regression_loss: 0.5956 - classification_loss: 0.0824 324/500 [==================>...........] - ETA: 58s - loss: 0.6782 - regression_loss: 0.5959 - classification_loss: 0.0823 325/500 [==================>...........] - ETA: 57s - loss: 0.6771 - regression_loss: 0.5949 - classification_loss: 0.0822 326/500 [==================>...........] - ETA: 57s - loss: 0.6778 - regression_loss: 0.5954 - classification_loss: 0.0824 327/500 [==================>...........] - ETA: 57s - loss: 0.6790 - regression_loss: 0.5963 - classification_loss: 0.0827 328/500 [==================>...........] - ETA: 56s - loss: 0.6786 - regression_loss: 0.5959 - classification_loss: 0.0826 329/500 [==================>...........] - ETA: 56s - loss: 0.6791 - regression_loss: 0.5962 - classification_loss: 0.0829 330/500 [==================>...........] - ETA: 56s - loss: 0.6795 - regression_loss: 0.5965 - classification_loss: 0.0830 331/500 [==================>...........] - ETA: 55s - loss: 0.6794 - regression_loss: 0.5964 - classification_loss: 0.0830 332/500 [==================>...........] - ETA: 55s - loss: 0.6792 - regression_loss: 0.5961 - classification_loss: 0.0830 333/500 [==================>...........] - ETA: 55s - loss: 0.6787 - regression_loss: 0.5957 - classification_loss: 0.0830 334/500 [===================>..........] 
- ETA: 54s - loss: 0.6801 - regression_loss: 0.5969 - classification_loss: 0.0832 335/500 [===================>..........] - ETA: 54s - loss: 0.6808 - regression_loss: 0.5975 - classification_loss: 0.0833 336/500 [===================>..........] - ETA: 54s - loss: 0.6814 - regression_loss: 0.5980 - classification_loss: 0.0834 337/500 [===================>..........] - ETA: 53s - loss: 0.6808 - regression_loss: 0.5975 - classification_loss: 0.0833 338/500 [===================>..........] - ETA: 53s - loss: 0.6801 - regression_loss: 0.5969 - classification_loss: 0.0832 339/500 [===================>..........] - ETA: 53s - loss: 0.6799 - regression_loss: 0.5968 - classification_loss: 0.0831 340/500 [===================>..........] - ETA: 52s - loss: 0.6787 - regression_loss: 0.5959 - classification_loss: 0.0829 341/500 [===================>..........] - ETA: 52s - loss: 0.6807 - regression_loss: 0.5975 - classification_loss: 0.0832 342/500 [===================>..........] - ETA: 52s - loss: 0.6817 - regression_loss: 0.5984 - classification_loss: 0.0833 343/500 [===================>..........] - ETA: 51s - loss: 0.6813 - regression_loss: 0.5980 - classification_loss: 0.0833 344/500 [===================>..........] - ETA: 51s - loss: 0.6816 - regression_loss: 0.5984 - classification_loss: 0.0833 345/500 [===================>..........] - ETA: 51s - loss: 0.6818 - regression_loss: 0.5985 - classification_loss: 0.0834 346/500 [===================>..........] - ETA: 50s - loss: 0.6819 - regression_loss: 0.5985 - classification_loss: 0.0833 347/500 [===================>..........] - ETA: 50s - loss: 0.6819 - regression_loss: 0.5986 - classification_loss: 0.0833 348/500 [===================>..........] - ETA: 50s - loss: 0.6815 - regression_loss: 0.5983 - classification_loss: 0.0831 349/500 [===================>..........] - ETA: 49s - loss: 0.6807 - regression_loss: 0.5977 - classification_loss: 0.0830 350/500 [====================>.........] 
500/500 [==============================] - 165s 330ms/step - loss: 0.6656 - regression_loss: 0.5853 - classification_loss: 0.0803
1172 instances of class plum with average precision: 0.6453
mAP: 0.6453
Epoch 00049: saving model to ./training/snapshots/resnet101_pascal_49.h5
Epoch 50/150
- ETA: 1:44 - loss: 0.6062 - regression_loss: 0.5321 - classification_loss: 0.0741 186/500 [==========>...................] - ETA: 1:43 - loss: 0.6082 - regression_loss: 0.5338 - classification_loss: 0.0744 187/500 [==========>...................] - ETA: 1:43 - loss: 0.6063 - regression_loss: 0.5321 - classification_loss: 0.0743 188/500 [==========>...................] - ETA: 1:43 - loss: 0.6049 - regression_loss: 0.5309 - classification_loss: 0.0740 189/500 [==========>...................] - ETA: 1:42 - loss: 0.6040 - regression_loss: 0.5302 - classification_loss: 0.0738 190/500 [==========>...................] - ETA: 1:42 - loss: 0.6043 - regression_loss: 0.5303 - classification_loss: 0.0739 191/500 [==========>...................] - ETA: 1:42 - loss: 0.6077 - regression_loss: 0.5334 - classification_loss: 0.0743 192/500 [==========>...................] - ETA: 1:41 - loss: 0.6070 - regression_loss: 0.5328 - classification_loss: 0.0742 193/500 [==========>...................] - ETA: 1:41 - loss: 0.6071 - regression_loss: 0.5328 - classification_loss: 0.0743 194/500 [==========>...................] - ETA: 1:41 - loss: 0.6072 - regression_loss: 0.5329 - classification_loss: 0.0743 195/500 [==========>...................] - ETA: 1:40 - loss: 0.6072 - regression_loss: 0.5328 - classification_loss: 0.0744 196/500 [==========>...................] - ETA: 1:40 - loss: 0.6057 - regression_loss: 0.5316 - classification_loss: 0.0742 197/500 [==========>...................] - ETA: 1:40 - loss: 0.6066 - regression_loss: 0.5324 - classification_loss: 0.0742 198/500 [==========>...................] - ETA: 1:39 - loss: 0.6077 - regression_loss: 0.5334 - classification_loss: 0.0743 199/500 [==========>...................] - ETA: 1:39 - loss: 0.6086 - regression_loss: 0.5341 - classification_loss: 0.0745 200/500 [===========>..................] - ETA: 1:39 - loss: 0.6073 - regression_loss: 0.5330 - classification_loss: 0.0744 201/500 [===========>..................] 
- ETA: 1:38 - loss: 0.6091 - regression_loss: 0.5345 - classification_loss: 0.0746 202/500 [===========>..................] - ETA: 1:38 - loss: 0.6077 - regression_loss: 0.5333 - classification_loss: 0.0744 203/500 [===========>..................] - ETA: 1:38 - loss: 0.6072 - regression_loss: 0.5328 - classification_loss: 0.0744 204/500 [===========>..................] - ETA: 1:37 - loss: 0.6079 - regression_loss: 0.5333 - classification_loss: 0.0746 205/500 [===========>..................] - ETA: 1:37 - loss: 0.6077 - regression_loss: 0.5332 - classification_loss: 0.0745 206/500 [===========>..................] - ETA: 1:37 - loss: 0.6081 - regression_loss: 0.5336 - classification_loss: 0.0745 207/500 [===========>..................] - ETA: 1:37 - loss: 0.6073 - regression_loss: 0.5329 - classification_loss: 0.0744 208/500 [===========>..................] - ETA: 1:36 - loss: 0.6060 - regression_loss: 0.5317 - classification_loss: 0.0743 209/500 [===========>..................] - ETA: 1:36 - loss: 0.6063 - regression_loss: 0.5319 - classification_loss: 0.0744 210/500 [===========>..................] - ETA: 1:35 - loss: 0.6066 - regression_loss: 0.5321 - classification_loss: 0.0745 211/500 [===========>..................] - ETA: 1:35 - loss: 0.6051 - regression_loss: 0.5309 - classification_loss: 0.0742 212/500 [===========>..................] - ETA: 1:35 - loss: 0.6068 - regression_loss: 0.5324 - classification_loss: 0.0744 213/500 [===========>..................] - ETA: 1:35 - loss: 0.6063 - regression_loss: 0.5318 - classification_loss: 0.0745 214/500 [===========>..................] - ETA: 1:34 - loss: 0.6059 - regression_loss: 0.5315 - classification_loss: 0.0745 215/500 [===========>..................] - ETA: 1:34 - loss: 0.6041 - regression_loss: 0.5298 - classification_loss: 0.0743 216/500 [===========>..................] - ETA: 1:34 - loss: 0.6053 - regression_loss: 0.5307 - classification_loss: 0.0746 217/500 [============>.................] 
- ETA: 1:33 - loss: 0.6040 - regression_loss: 0.5296 - classification_loss: 0.0744 218/500 [============>.................] - ETA: 1:33 - loss: 0.6030 - regression_loss: 0.5287 - classification_loss: 0.0743 219/500 [============>.................] - ETA: 1:33 - loss: 0.6038 - regression_loss: 0.5292 - classification_loss: 0.0746 220/500 [============>.................] - ETA: 1:32 - loss: 0.6054 - regression_loss: 0.5306 - classification_loss: 0.0748 221/500 [============>.................] - ETA: 1:32 - loss: 0.6050 - regression_loss: 0.5302 - classification_loss: 0.0748 222/500 [============>.................] - ETA: 1:32 - loss: 0.6058 - regression_loss: 0.5311 - classification_loss: 0.0747 223/500 [============>.................] - ETA: 1:31 - loss: 0.6064 - regression_loss: 0.5319 - classification_loss: 0.0745 224/500 [============>.................] - ETA: 1:31 - loss: 0.6061 - regression_loss: 0.5316 - classification_loss: 0.0745 225/500 [============>.................] - ETA: 1:31 - loss: 0.6047 - regression_loss: 0.5304 - classification_loss: 0.0743 226/500 [============>.................] - ETA: 1:30 - loss: 0.6049 - regression_loss: 0.5307 - classification_loss: 0.0743 227/500 [============>.................] - ETA: 1:30 - loss: 0.6054 - regression_loss: 0.5311 - classification_loss: 0.0744 228/500 [============>.................] - ETA: 1:30 - loss: 0.6046 - regression_loss: 0.5304 - classification_loss: 0.0743 229/500 [============>.................] - ETA: 1:29 - loss: 0.6044 - regression_loss: 0.5302 - classification_loss: 0.0742 230/500 [============>.................] - ETA: 1:29 - loss: 0.6043 - regression_loss: 0.5302 - classification_loss: 0.0741 231/500 [============>.................] - ETA: 1:29 - loss: 0.6063 - regression_loss: 0.5319 - classification_loss: 0.0744 232/500 [============>.................] - ETA: 1:28 - loss: 0.6067 - regression_loss: 0.5324 - classification_loss: 0.0743 233/500 [============>.................] 
- ETA: 1:28 - loss: 0.6055 - regression_loss: 0.5314 - classification_loss: 0.0741 234/500 [=============>................] - ETA: 1:28 - loss: 0.6040 - regression_loss: 0.5301 - classification_loss: 0.0739 235/500 [=============>................] - ETA: 1:27 - loss: 0.6056 - regression_loss: 0.5314 - classification_loss: 0.0742 236/500 [=============>................] - ETA: 1:27 - loss: 0.6066 - regression_loss: 0.5322 - classification_loss: 0.0744 237/500 [=============>................] - ETA: 1:27 - loss: 0.6059 - regression_loss: 0.5316 - classification_loss: 0.0743 238/500 [=============>................] - ETA: 1:26 - loss: 0.6068 - regression_loss: 0.5325 - classification_loss: 0.0743 239/500 [=============>................] - ETA: 1:26 - loss: 0.6071 - regression_loss: 0.5327 - classification_loss: 0.0743 240/500 [=============>................] - ETA: 1:26 - loss: 0.6061 - regression_loss: 0.5318 - classification_loss: 0.0742 241/500 [=============>................] - ETA: 1:25 - loss: 0.6079 - regression_loss: 0.5334 - classification_loss: 0.0745 242/500 [=============>................] - ETA: 1:25 - loss: 0.6099 - regression_loss: 0.5350 - classification_loss: 0.0749 243/500 [=============>................] - ETA: 1:25 - loss: 0.6098 - regression_loss: 0.5350 - classification_loss: 0.0748 244/500 [=============>................] - ETA: 1:24 - loss: 0.6106 - regression_loss: 0.5357 - classification_loss: 0.0749 245/500 [=============>................] - ETA: 1:24 - loss: 0.6099 - regression_loss: 0.5352 - classification_loss: 0.0747 246/500 [=============>................] - ETA: 1:24 - loss: 0.6101 - regression_loss: 0.5354 - classification_loss: 0.0747 247/500 [=============>................] - ETA: 1:23 - loss: 0.6093 - regression_loss: 0.5347 - classification_loss: 0.0747 248/500 [=============>................] - ETA: 1:23 - loss: 0.6108 - regression_loss: 0.5359 - classification_loss: 0.0749 249/500 [=============>................] 
- ETA: 1:23 - loss: 0.6119 - regression_loss: 0.5369 - classification_loss: 0.0751 250/500 [==============>...............] - ETA: 1:22 - loss: 0.6139 - regression_loss: 0.5386 - classification_loss: 0.0754 251/500 [==============>...............] - ETA: 1:22 - loss: 0.6140 - regression_loss: 0.5387 - classification_loss: 0.0754 252/500 [==============>...............] - ETA: 1:22 - loss: 0.6152 - regression_loss: 0.5399 - classification_loss: 0.0752 253/500 [==============>...............] - ETA: 1:21 - loss: 0.6142 - regression_loss: 0.5391 - classification_loss: 0.0751 254/500 [==============>...............] - ETA: 1:21 - loss: 0.6128 - regression_loss: 0.5379 - classification_loss: 0.0749 255/500 [==============>...............] - ETA: 1:21 - loss: 0.6122 - regression_loss: 0.5373 - classification_loss: 0.0749 256/500 [==============>...............] - ETA: 1:20 - loss: 0.6111 - regression_loss: 0.5363 - classification_loss: 0.0748 257/500 [==============>...............] - ETA: 1:20 - loss: 0.6111 - regression_loss: 0.5364 - classification_loss: 0.0747 258/500 [==============>...............] - ETA: 1:20 - loss: 0.6128 - regression_loss: 0.5379 - classification_loss: 0.0750 259/500 [==============>...............] - ETA: 1:19 - loss: 0.6130 - regression_loss: 0.5381 - classification_loss: 0.0750 260/500 [==============>...............] - ETA: 1:19 - loss: 0.6119 - regression_loss: 0.5371 - classification_loss: 0.0748 261/500 [==============>...............] - ETA: 1:19 - loss: 0.6101 - regression_loss: 0.5355 - classification_loss: 0.0745 262/500 [==============>...............] - ETA: 1:18 - loss: 0.6092 - regression_loss: 0.5348 - classification_loss: 0.0744 263/500 [==============>...............] - ETA: 1:18 - loss: 0.6112 - regression_loss: 0.5364 - classification_loss: 0.0748 264/500 [==============>...............] - ETA: 1:18 - loss: 0.6100 - regression_loss: 0.5354 - classification_loss: 0.0746 265/500 [==============>...............] 
- ETA: 1:17 - loss: 0.6096 - regression_loss: 0.5352 - classification_loss: 0.0744 266/500 [==============>...............] - ETA: 1:17 - loss: 0.6080 - regression_loss: 0.5337 - classification_loss: 0.0742 267/500 [===============>..............] - ETA: 1:17 - loss: 0.6073 - regression_loss: 0.5331 - classification_loss: 0.0742 268/500 [===============>..............] - ETA: 1:16 - loss: 0.6077 - regression_loss: 0.5334 - classification_loss: 0.0743 269/500 [===============>..............] - ETA: 1:16 - loss: 0.6096 - regression_loss: 0.5350 - classification_loss: 0.0746 270/500 [===============>..............] - ETA: 1:16 - loss: 0.6095 - regression_loss: 0.5349 - classification_loss: 0.0747 271/500 [===============>..............] - ETA: 1:15 - loss: 0.6084 - regression_loss: 0.5339 - classification_loss: 0.0744 272/500 [===============>..............] - ETA: 1:15 - loss: 0.6077 - regression_loss: 0.5333 - classification_loss: 0.0743 273/500 [===============>..............] - ETA: 1:15 - loss: 0.6080 - regression_loss: 0.5337 - classification_loss: 0.0743 274/500 [===============>..............] - ETA: 1:14 - loss: 0.6088 - regression_loss: 0.5343 - classification_loss: 0.0744 275/500 [===============>..............] - ETA: 1:14 - loss: 0.6103 - regression_loss: 0.5357 - classification_loss: 0.0745 276/500 [===============>..............] - ETA: 1:14 - loss: 0.6102 - regression_loss: 0.5358 - classification_loss: 0.0745 277/500 [===============>..............] - ETA: 1:13 - loss: 0.6112 - regression_loss: 0.5367 - classification_loss: 0.0745 278/500 [===============>..............] - ETA: 1:13 - loss: 0.6109 - regression_loss: 0.5365 - classification_loss: 0.0744 279/500 [===============>..............] - ETA: 1:13 - loss: 0.6106 - regression_loss: 0.5364 - classification_loss: 0.0743 280/500 [===============>..............] - ETA: 1:12 - loss: 0.6105 - regression_loss: 0.5363 - classification_loss: 0.0742 281/500 [===============>..............] 
- ETA: 1:12 - loss: 0.6114 - regression_loss: 0.5370 - classification_loss: 0.0744 282/500 [===============>..............] - ETA: 1:12 - loss: 0.6110 - regression_loss: 0.5367 - classification_loss: 0.0744 283/500 [===============>..............] - ETA: 1:11 - loss: 0.6110 - regression_loss: 0.5367 - classification_loss: 0.0743 284/500 [================>.............] - ETA: 1:11 - loss: 0.6108 - regression_loss: 0.5365 - classification_loss: 0.0743 285/500 [================>.............] - ETA: 1:11 - loss: 0.6127 - regression_loss: 0.5381 - classification_loss: 0.0746 286/500 [================>.............] - ETA: 1:10 - loss: 0.6133 - regression_loss: 0.5387 - classification_loss: 0.0746 287/500 [================>.............] - ETA: 1:10 - loss: 0.6117 - regression_loss: 0.5373 - classification_loss: 0.0745 288/500 [================>.............] - ETA: 1:10 - loss: 0.6132 - regression_loss: 0.5385 - classification_loss: 0.0747 289/500 [================>.............] - ETA: 1:09 - loss: 0.6128 - regression_loss: 0.5382 - classification_loss: 0.0746 290/500 [================>.............] - ETA: 1:09 - loss: 0.6119 - regression_loss: 0.5374 - classification_loss: 0.0745 291/500 [================>.............] - ETA: 1:09 - loss: 0.6123 - regression_loss: 0.5378 - classification_loss: 0.0745 292/500 [================>.............] - ETA: 1:08 - loss: 0.6127 - regression_loss: 0.5381 - classification_loss: 0.0746 293/500 [================>.............] - ETA: 1:08 - loss: 0.6120 - regression_loss: 0.5375 - classification_loss: 0.0744 294/500 [================>.............] - ETA: 1:08 - loss: 0.6123 - regression_loss: 0.5378 - classification_loss: 0.0745 295/500 [================>.............] - ETA: 1:07 - loss: 0.6145 - regression_loss: 0.5396 - classification_loss: 0.0749 296/500 [================>.............] - ETA: 1:07 - loss: 0.6158 - regression_loss: 0.5407 - classification_loss: 0.0751 297/500 [================>.............] 
- ETA: 1:07 - loss: 0.6164 - regression_loss: 0.5412 - classification_loss: 0.0752 298/500 [================>.............] - ETA: 1:06 - loss: 0.6166 - regression_loss: 0.5414 - classification_loss: 0.0752 299/500 [================>.............] - ETA: 1:06 - loss: 0.6164 - regression_loss: 0.5412 - classification_loss: 0.0751 300/500 [=================>............] - ETA: 1:06 - loss: 0.6157 - regression_loss: 0.5407 - classification_loss: 0.0750 301/500 [=================>............] - ETA: 1:05 - loss: 0.6147 - regression_loss: 0.5398 - classification_loss: 0.0748 302/500 [=================>............] - ETA: 1:05 - loss: 0.6152 - regression_loss: 0.5403 - classification_loss: 0.0750 303/500 [=================>............] - ETA: 1:05 - loss: 0.6158 - regression_loss: 0.5408 - classification_loss: 0.0750 304/500 [=================>............] - ETA: 1:04 - loss: 0.6155 - regression_loss: 0.5406 - classification_loss: 0.0749 305/500 [=================>............] - ETA: 1:04 - loss: 0.6149 - regression_loss: 0.5401 - classification_loss: 0.0748 306/500 [=================>............] - ETA: 1:04 - loss: 0.6152 - regression_loss: 0.5404 - classification_loss: 0.0748 307/500 [=================>............] - ETA: 1:03 - loss: 0.6148 - regression_loss: 0.5401 - classification_loss: 0.0747 308/500 [=================>............] - ETA: 1:03 - loss: 0.6142 - regression_loss: 0.5396 - classification_loss: 0.0746 309/500 [=================>............] - ETA: 1:03 - loss: 0.6140 - regression_loss: 0.5396 - classification_loss: 0.0745 310/500 [=================>............] - ETA: 1:02 - loss: 0.6142 - regression_loss: 0.5396 - classification_loss: 0.0746 311/500 [=================>............] - ETA: 1:02 - loss: 0.6128 - regression_loss: 0.5385 - classification_loss: 0.0744 312/500 [=================>............] - ETA: 1:02 - loss: 0.6133 - regression_loss: 0.5388 - classification_loss: 0.0745 313/500 [=================>............] 
- ETA: 1:01 - loss: 0.6121 - regression_loss: 0.5377 - classification_loss: 0.0743 314/500 [=================>............] - ETA: 1:01 - loss: 0.6124 - regression_loss: 0.5380 - classification_loss: 0.0744 315/500 [=================>............] - ETA: 1:01 - loss: 0.6121 - regression_loss: 0.5378 - classification_loss: 0.0743 316/500 [=================>............] - ETA: 1:00 - loss: 0.6116 - regression_loss: 0.5375 - classification_loss: 0.0741 317/500 [==================>...........] - ETA: 1:00 - loss: 0.6130 - regression_loss: 0.5386 - classification_loss: 0.0743 318/500 [==================>...........] - ETA: 1:00 - loss: 0.6128 - regression_loss: 0.5385 - classification_loss: 0.0743 319/500 [==================>...........] - ETA: 59s - loss: 0.6128 - regression_loss: 0.5386 - classification_loss: 0.0742  320/500 [==================>...........] - ETA: 59s - loss: 0.6124 - regression_loss: 0.5382 - classification_loss: 0.0742 321/500 [==================>...........] - ETA: 59s - loss: 0.6112 - regression_loss: 0.5372 - classification_loss: 0.0740 322/500 [==================>...........] - ETA: 58s - loss: 0.6122 - regression_loss: 0.5380 - classification_loss: 0.0741 323/500 [==================>...........] - ETA: 58s - loss: 0.6125 - regression_loss: 0.5383 - classification_loss: 0.0742 324/500 [==================>...........] - ETA: 58s - loss: 0.6117 - regression_loss: 0.5376 - classification_loss: 0.0740 325/500 [==================>...........] - ETA: 57s - loss: 0.6119 - regression_loss: 0.5378 - classification_loss: 0.0741 326/500 [==================>...........] - ETA: 57s - loss: 0.6114 - regression_loss: 0.5374 - classification_loss: 0.0740 327/500 [==================>...........] - ETA: 57s - loss: 0.6121 - regression_loss: 0.5381 - classification_loss: 0.0740 328/500 [==================>...........] - ETA: 56s - loss: 0.6112 - regression_loss: 0.5373 - classification_loss: 0.0739 329/500 [==================>...........] 
- ETA: 56s - loss: 0.6111 - regression_loss: 0.5372 - classification_loss: 0.0739 330/500 [==================>...........] - ETA: 56s - loss: 0.6121 - regression_loss: 0.5379 - classification_loss: 0.0741 331/500 [==================>...........] - ETA: 55s - loss: 0.6115 - regression_loss: 0.5374 - classification_loss: 0.0740 332/500 [==================>...........] - ETA: 55s - loss: 0.6116 - regression_loss: 0.5376 - classification_loss: 0.0740 333/500 [==================>...........] - ETA: 55s - loss: 0.6122 - regression_loss: 0.5383 - classification_loss: 0.0740 334/500 [===================>..........] - ETA: 54s - loss: 0.6114 - regression_loss: 0.5376 - classification_loss: 0.0739 335/500 [===================>..........] - ETA: 54s - loss: 0.6117 - regression_loss: 0.5379 - classification_loss: 0.0738 336/500 [===================>..........] - ETA: 54s - loss: 0.6116 - regression_loss: 0.5378 - classification_loss: 0.0738 337/500 [===================>..........] - ETA: 53s - loss: 0.6117 - regression_loss: 0.5379 - classification_loss: 0.0738 338/500 [===================>..........] - ETA: 53s - loss: 0.6113 - regression_loss: 0.5376 - classification_loss: 0.0737 339/500 [===================>..........] - ETA: 53s - loss: 0.6108 - regression_loss: 0.5371 - classification_loss: 0.0737 340/500 [===================>..........] - ETA: 52s - loss: 0.6099 - regression_loss: 0.5364 - classification_loss: 0.0735 341/500 [===================>..........] - ETA: 52s - loss: 0.6101 - regression_loss: 0.5366 - classification_loss: 0.0735 342/500 [===================>..........] - ETA: 52s - loss: 0.6099 - regression_loss: 0.5365 - classification_loss: 0.0734 343/500 [===================>..........] - ETA: 51s - loss: 0.6099 - regression_loss: 0.5365 - classification_loss: 0.0734 344/500 [===================>..........] - ETA: 51s - loss: 0.6100 - regression_loss: 0.5366 - classification_loss: 0.0734 345/500 [===================>..........] 
- ETA: 51s - loss: 0.6102 - regression_loss: 0.5369 - classification_loss: 0.0734 346/500 [===================>..........] - ETA: 50s - loss: 0.6096 - regression_loss: 0.5363 - classification_loss: 0.0733 347/500 [===================>..........] - ETA: 50s - loss: 0.6092 - regression_loss: 0.5360 - classification_loss: 0.0732 348/500 [===================>..........] - ETA: 50s - loss: 0.6096 - regression_loss: 0.5363 - classification_loss: 0.0733 349/500 [===================>..........] - ETA: 49s - loss: 0.6093 - regression_loss: 0.5360 - classification_loss: 0.0733 350/500 [====================>.........] - ETA: 49s - loss: 0.6088 - regression_loss: 0.5356 - classification_loss: 0.0732 351/500 [====================>.........] - ETA: 49s - loss: 0.6078 - regression_loss: 0.5347 - classification_loss: 0.0731 352/500 [====================>.........] - ETA: 48s - loss: 0.6073 - regression_loss: 0.5343 - classification_loss: 0.0730 353/500 [====================>.........] - ETA: 48s - loss: 0.6070 - regression_loss: 0.5340 - classification_loss: 0.0730 354/500 [====================>.........] - ETA: 48s - loss: 0.6066 - regression_loss: 0.5337 - classification_loss: 0.0729 355/500 [====================>.........] - ETA: 47s - loss: 0.6062 - regression_loss: 0.5334 - classification_loss: 0.0728 356/500 [====================>.........] - ETA: 47s - loss: 0.6066 - regression_loss: 0.5338 - classification_loss: 0.0729 357/500 [====================>.........] - ETA: 47s - loss: 0.6064 - regression_loss: 0.5335 - classification_loss: 0.0729 358/500 [====================>.........] - ETA: 46s - loss: 0.6053 - regression_loss: 0.5326 - classification_loss: 0.0727 359/500 [====================>.........] - ETA: 46s - loss: 0.6043 - regression_loss: 0.5318 - classification_loss: 0.0726 360/500 [====================>.........] - ETA: 46s - loss: 0.6046 - regression_loss: 0.5320 - classification_loss: 0.0726 361/500 [====================>.........] 
- ETA: 45s - loss: 0.6055 - regression_loss: 0.5326 - classification_loss: 0.0728 362/500 [====================>.........] - ETA: 45s - loss: 0.6061 - regression_loss: 0.5331 - classification_loss: 0.0729 363/500 [====================>.........] - ETA: 45s - loss: 0.6060 - regression_loss: 0.5330 - classification_loss: 0.0729 364/500 [====================>.........] - ETA: 44s - loss: 0.6065 - regression_loss: 0.5334 - classification_loss: 0.0731 365/500 [====================>.........] - ETA: 44s - loss: 0.6081 - regression_loss: 0.5346 - classification_loss: 0.0734 366/500 [====================>.........] - ETA: 44s - loss: 0.6079 - regression_loss: 0.5346 - classification_loss: 0.0733 367/500 [=====================>........] - ETA: 43s - loss: 0.6079 - regression_loss: 0.5346 - classification_loss: 0.0733 368/500 [=====================>........] - ETA: 43s - loss: 0.6074 - regression_loss: 0.5342 - classification_loss: 0.0732 369/500 [=====================>........] - ETA: 43s - loss: 0.6089 - regression_loss: 0.5355 - classification_loss: 0.0735 370/500 [=====================>........] - ETA: 42s - loss: 0.6077 - regression_loss: 0.5344 - classification_loss: 0.0733 371/500 [=====================>........] - ETA: 42s - loss: 0.6067 - regression_loss: 0.5335 - classification_loss: 0.0732 372/500 [=====================>........] - ETA: 42s - loss: 0.6077 - regression_loss: 0.5343 - classification_loss: 0.0734 373/500 [=====================>........] - ETA: 41s - loss: 0.6079 - regression_loss: 0.5344 - classification_loss: 0.0734 374/500 [=====================>........] - ETA: 41s - loss: 0.6072 - regression_loss: 0.5339 - classification_loss: 0.0733 375/500 [=====================>........] - ETA: 41s - loss: 0.6065 - regression_loss: 0.5333 - classification_loss: 0.0732 376/500 [=====================>........] - ETA: 40s - loss: 0.6059 - regression_loss: 0.5328 - classification_loss: 0.0731 377/500 [=====================>........] 
- ETA: 40s - loss: 0.6047 - regression_loss: 0.5317 - classification_loss: 0.0730 378/500 [=====================>........] - ETA: 40s - loss: 0.6045 - regression_loss: 0.5316 - classification_loss: 0.0730 379/500 [=====================>........] - ETA: 39s - loss: 0.6044 - regression_loss: 0.5315 - classification_loss: 0.0729 380/500 [=====================>........] - ETA: 39s - loss: 0.6038 - regression_loss: 0.5310 - classification_loss: 0.0728 381/500 [=====================>........] - ETA: 39s - loss: 0.6048 - regression_loss: 0.5321 - classification_loss: 0.0727 382/500 [=====================>........] - ETA: 38s - loss: 0.6041 - regression_loss: 0.5315 - classification_loss: 0.0727 383/500 [=====================>........] - ETA: 38s - loss: 0.6046 - regression_loss: 0.5318 - classification_loss: 0.0728 384/500 [======================>.......] - ETA: 38s - loss: 0.6040 - regression_loss: 0.5313 - classification_loss: 0.0727 385/500 [======================>.......] - ETA: 37s - loss: 0.6043 - regression_loss: 0.5316 - classification_loss: 0.0727 386/500 [======================>.......] - ETA: 37s - loss: 0.6035 - regression_loss: 0.5309 - classification_loss: 0.0726 387/500 [======================>.......] - ETA: 37s - loss: 0.6033 - regression_loss: 0.5307 - classification_loss: 0.0726 388/500 [======================>.......] - ETA: 36s - loss: 0.6038 - regression_loss: 0.5311 - classification_loss: 0.0726 389/500 [======================>.......] - ETA: 36s - loss: 0.6046 - regression_loss: 0.5319 - classification_loss: 0.0727 390/500 [======================>.......] - ETA: 36s - loss: 0.6053 - regression_loss: 0.5326 - classification_loss: 0.0727 391/500 [======================>.......] - ETA: 35s - loss: 0.6053 - regression_loss: 0.5326 - classification_loss: 0.0727 392/500 [======================>.......] - ETA: 35s - loss: 0.6045 - regression_loss: 0.5319 - classification_loss: 0.0726 393/500 [======================>.......] 
[per-batch progress updates for epoch 50 (batches 394-499) omitted]
500/500 [==============================] - 165s 330ms/step - loss: 0.6069 - regression_loss: 0.5341 - classification_loss: 0.0728
1172 instances of class plum with average precision: 0.6268
mAP: 0.6268
Epoch 00050: saving model to ./training/snapshots/resnet101_pascal_50.h5
Epoch 51/150
[per-batch progress updates for epoch 51 (batches 1-228) omitted; loss hovered around 0.60-0.70]
- ETA: 1:30 - loss: 0.6161 - regression_loss: 0.5428 - classification_loss: 0.0733 229/500 [============>.................] - ETA: 1:29 - loss: 0.6151 - regression_loss: 0.5420 - classification_loss: 0.0732 230/500 [============>.................] - ETA: 1:29 - loss: 0.6150 - regression_loss: 0.5419 - classification_loss: 0.0731 231/500 [============>.................] - ETA: 1:29 - loss: 0.6153 - regression_loss: 0.5422 - classification_loss: 0.0731 232/500 [============>.................] - ETA: 1:28 - loss: 0.6147 - regression_loss: 0.5417 - classification_loss: 0.0730 233/500 [============>.................] - ETA: 1:28 - loss: 0.6134 - regression_loss: 0.5406 - classification_loss: 0.0728 234/500 [=============>................] - ETA: 1:28 - loss: 0.6142 - regression_loss: 0.5413 - classification_loss: 0.0729 235/500 [=============>................] - ETA: 1:27 - loss: 0.6152 - regression_loss: 0.5425 - classification_loss: 0.0727 236/500 [=============>................] - ETA: 1:27 - loss: 0.6144 - regression_loss: 0.5418 - classification_loss: 0.0726 237/500 [=============>................] - ETA: 1:27 - loss: 0.6136 - regression_loss: 0.5411 - classification_loss: 0.0725 238/500 [=============>................] - ETA: 1:26 - loss: 0.6161 - regression_loss: 0.5434 - classification_loss: 0.0727 239/500 [=============>................] - ETA: 1:26 - loss: 0.6148 - regression_loss: 0.5423 - classification_loss: 0.0724 240/500 [=============>................] - ETA: 1:26 - loss: 0.6139 - regression_loss: 0.5416 - classification_loss: 0.0723 241/500 [=============>................] - ETA: 1:25 - loss: 0.6125 - regression_loss: 0.5404 - classification_loss: 0.0722 242/500 [=============>................] - ETA: 1:25 - loss: 0.6132 - regression_loss: 0.5410 - classification_loss: 0.0721 243/500 [=============>................] - ETA: 1:25 - loss: 0.6144 - regression_loss: 0.5422 - classification_loss: 0.0722 244/500 [=============>................] 
- ETA: 1:24 - loss: 0.6159 - regression_loss: 0.5436 - classification_loss: 0.0723 245/500 [=============>................] - ETA: 1:24 - loss: 0.6146 - regression_loss: 0.5425 - classification_loss: 0.0721 246/500 [=============>................] - ETA: 1:24 - loss: 0.6156 - regression_loss: 0.5434 - classification_loss: 0.0722 247/500 [=============>................] - ETA: 1:23 - loss: 0.6168 - regression_loss: 0.5442 - classification_loss: 0.0725 248/500 [=============>................] - ETA: 1:23 - loss: 0.6175 - regression_loss: 0.5449 - classification_loss: 0.0726 249/500 [=============>................] - ETA: 1:23 - loss: 0.6177 - regression_loss: 0.5450 - classification_loss: 0.0727 250/500 [==============>...............] - ETA: 1:22 - loss: 0.6193 - regression_loss: 0.5463 - classification_loss: 0.0730 251/500 [==============>...............] - ETA: 1:22 - loss: 0.6211 - regression_loss: 0.5480 - classification_loss: 0.0731 252/500 [==============>...............] - ETA: 1:22 - loss: 0.6199 - regression_loss: 0.5470 - classification_loss: 0.0729 253/500 [==============>...............] - ETA: 1:21 - loss: 0.6210 - regression_loss: 0.5479 - classification_loss: 0.0730 254/500 [==============>...............] - ETA: 1:21 - loss: 0.6203 - regression_loss: 0.5473 - classification_loss: 0.0730 255/500 [==============>...............] - ETA: 1:21 - loss: 0.6207 - regression_loss: 0.5476 - classification_loss: 0.0731 256/500 [==============>...............] - ETA: 1:20 - loss: 0.6195 - regression_loss: 0.5465 - classification_loss: 0.0729 257/500 [==============>...............] - ETA: 1:20 - loss: 0.6197 - regression_loss: 0.5468 - classification_loss: 0.0729 258/500 [==============>...............] - ETA: 1:20 - loss: 0.6192 - regression_loss: 0.5464 - classification_loss: 0.0728 259/500 [==============>...............] - ETA: 1:19 - loss: 0.6198 - regression_loss: 0.5470 - classification_loss: 0.0729 260/500 [==============>...............] 
- ETA: 1:19 - loss: 0.6217 - regression_loss: 0.5486 - classification_loss: 0.0731 261/500 [==============>...............] - ETA: 1:19 - loss: 0.6213 - regression_loss: 0.5482 - classification_loss: 0.0730 262/500 [==============>...............] - ETA: 1:18 - loss: 0.6205 - regression_loss: 0.5475 - classification_loss: 0.0730 263/500 [==============>...............] - ETA: 1:18 - loss: 0.6213 - regression_loss: 0.5483 - classification_loss: 0.0730 264/500 [==============>...............] - ETA: 1:18 - loss: 0.6216 - regression_loss: 0.5487 - classification_loss: 0.0730 265/500 [==============>...............] - ETA: 1:17 - loss: 0.6218 - regression_loss: 0.5488 - classification_loss: 0.0730 266/500 [==============>...............] - ETA: 1:17 - loss: 0.6205 - regression_loss: 0.5476 - classification_loss: 0.0729 267/500 [===============>..............] - ETA: 1:17 - loss: 0.6193 - regression_loss: 0.5466 - classification_loss: 0.0727 268/500 [===============>..............] - ETA: 1:16 - loss: 0.6191 - regression_loss: 0.5465 - classification_loss: 0.0726 269/500 [===============>..............] - ETA: 1:16 - loss: 0.6221 - regression_loss: 0.5489 - classification_loss: 0.0731 270/500 [===============>..............] - ETA: 1:16 - loss: 0.6248 - regression_loss: 0.5512 - classification_loss: 0.0735 271/500 [===============>..............] - ETA: 1:15 - loss: 0.6262 - regression_loss: 0.5524 - classification_loss: 0.0737 272/500 [===============>..............] - ETA: 1:15 - loss: 0.6272 - regression_loss: 0.5533 - classification_loss: 0.0740 273/500 [===============>..............] - ETA: 1:15 - loss: 0.6283 - regression_loss: 0.5541 - classification_loss: 0.0742 274/500 [===============>..............] - ETA: 1:14 - loss: 0.6273 - regression_loss: 0.5532 - classification_loss: 0.0741 275/500 [===============>..............] - ETA: 1:14 - loss: 0.6276 - regression_loss: 0.5535 - classification_loss: 0.0741 276/500 [===============>..............] 
- ETA: 1:14 - loss: 0.6265 - regression_loss: 0.5524 - classification_loss: 0.0740 277/500 [===============>..............] - ETA: 1:13 - loss: 0.6264 - regression_loss: 0.5523 - classification_loss: 0.0741 278/500 [===============>..............] - ETA: 1:13 - loss: 0.6259 - regression_loss: 0.5517 - classification_loss: 0.0741 279/500 [===============>..............] - ETA: 1:13 - loss: 0.6260 - regression_loss: 0.5519 - classification_loss: 0.0741 280/500 [===============>..............] - ETA: 1:12 - loss: 0.6254 - regression_loss: 0.5514 - classification_loss: 0.0740 281/500 [===============>..............] - ETA: 1:12 - loss: 0.6246 - regression_loss: 0.5507 - classification_loss: 0.0739 282/500 [===============>..............] - ETA: 1:12 - loss: 0.6255 - regression_loss: 0.5513 - classification_loss: 0.0741 283/500 [===============>..............] - ETA: 1:11 - loss: 0.6259 - regression_loss: 0.5517 - classification_loss: 0.0742 284/500 [================>.............] - ETA: 1:11 - loss: 0.6246 - regression_loss: 0.5506 - classification_loss: 0.0740 285/500 [================>.............] - ETA: 1:11 - loss: 0.6246 - regression_loss: 0.5506 - classification_loss: 0.0740 286/500 [================>.............] - ETA: 1:10 - loss: 0.6248 - regression_loss: 0.5508 - classification_loss: 0.0740 287/500 [================>.............] - ETA: 1:10 - loss: 0.6234 - regression_loss: 0.5495 - classification_loss: 0.0739 288/500 [================>.............] - ETA: 1:10 - loss: 0.6252 - regression_loss: 0.5510 - classification_loss: 0.0742 289/500 [================>.............] - ETA: 1:09 - loss: 0.6241 - regression_loss: 0.5500 - classification_loss: 0.0740 290/500 [================>.............] - ETA: 1:09 - loss: 0.6243 - regression_loss: 0.5503 - classification_loss: 0.0740 291/500 [================>.............] - ETA: 1:09 - loss: 0.6245 - regression_loss: 0.5504 - classification_loss: 0.0741 292/500 [================>.............] 
- ETA: 1:08 - loss: 0.6254 - regression_loss: 0.5511 - classification_loss: 0.0743 293/500 [================>.............] - ETA: 1:08 - loss: 0.6252 - regression_loss: 0.5510 - classification_loss: 0.0742 294/500 [================>.............] - ETA: 1:08 - loss: 0.6244 - regression_loss: 0.5502 - classification_loss: 0.0741 295/500 [================>.............] - ETA: 1:07 - loss: 0.6238 - regression_loss: 0.5498 - classification_loss: 0.0740 296/500 [================>.............] - ETA: 1:07 - loss: 0.6242 - regression_loss: 0.5502 - classification_loss: 0.0740 297/500 [================>.............] - ETA: 1:07 - loss: 0.6250 - regression_loss: 0.5510 - classification_loss: 0.0740 298/500 [================>.............] - ETA: 1:06 - loss: 0.6243 - regression_loss: 0.5504 - classification_loss: 0.0739 299/500 [================>.............] - ETA: 1:06 - loss: 0.6246 - regression_loss: 0.5507 - classification_loss: 0.0740 300/500 [=================>............] - ETA: 1:06 - loss: 0.6237 - regression_loss: 0.5499 - classification_loss: 0.0738 301/500 [=================>............] - ETA: 1:05 - loss: 0.6239 - regression_loss: 0.5500 - classification_loss: 0.0739 302/500 [=================>............] - ETA: 1:05 - loss: 0.6262 - regression_loss: 0.5520 - classification_loss: 0.0742 303/500 [=================>............] - ETA: 1:05 - loss: 0.6255 - regression_loss: 0.5514 - classification_loss: 0.0741 304/500 [=================>............] - ETA: 1:04 - loss: 0.6258 - regression_loss: 0.5517 - classification_loss: 0.0741 305/500 [=================>............] - ETA: 1:04 - loss: 0.6259 - regression_loss: 0.5518 - classification_loss: 0.0741 306/500 [=================>............] - ETA: 1:04 - loss: 0.6259 - regression_loss: 0.5518 - classification_loss: 0.0741 307/500 [=================>............] - ETA: 1:03 - loss: 0.6253 - regression_loss: 0.5513 - classification_loss: 0.0739 308/500 [=================>............] 
- ETA: 1:03 - loss: 0.6257 - regression_loss: 0.5517 - classification_loss: 0.0740 309/500 [=================>............] - ETA: 1:03 - loss: 0.6266 - regression_loss: 0.5524 - classification_loss: 0.0742 310/500 [=================>............] - ETA: 1:02 - loss: 0.6253 - regression_loss: 0.5513 - classification_loss: 0.0740 311/500 [=================>............] - ETA: 1:02 - loss: 0.6249 - regression_loss: 0.5509 - classification_loss: 0.0739 312/500 [=================>............] - ETA: 1:02 - loss: 0.6235 - regression_loss: 0.5497 - classification_loss: 0.0738 313/500 [=================>............] - ETA: 1:01 - loss: 0.6235 - regression_loss: 0.5496 - classification_loss: 0.0739 314/500 [=================>............] - ETA: 1:01 - loss: 0.6225 - regression_loss: 0.5488 - classification_loss: 0.0738 315/500 [=================>............] - ETA: 1:01 - loss: 0.6209 - regression_loss: 0.5473 - classification_loss: 0.0736 316/500 [=================>............] - ETA: 1:00 - loss: 0.6201 - regression_loss: 0.5467 - classification_loss: 0.0734 317/500 [==================>...........] - ETA: 1:00 - loss: 0.6211 - regression_loss: 0.5476 - classification_loss: 0.0736 318/500 [==================>...........] - ETA: 1:00 - loss: 0.6213 - regression_loss: 0.5478 - classification_loss: 0.0736 319/500 [==================>...........] - ETA: 59s - loss: 0.6222 - regression_loss: 0.5485 - classification_loss: 0.0737  320/500 [==================>...........] - ETA: 59s - loss: 0.6228 - regression_loss: 0.5490 - classification_loss: 0.0738 321/500 [==================>...........] - ETA: 59s - loss: 0.6237 - regression_loss: 0.5497 - classification_loss: 0.0740 322/500 [==================>...........] - ETA: 58s - loss: 0.6248 - regression_loss: 0.5506 - classification_loss: 0.0742 323/500 [==================>...........] - ETA: 58s - loss: 0.6238 - regression_loss: 0.5498 - classification_loss: 0.0740 324/500 [==================>...........] 
- ETA: 58s - loss: 0.6233 - regression_loss: 0.5494 - classification_loss: 0.0739 325/500 [==================>...........] - ETA: 57s - loss: 0.6251 - regression_loss: 0.5508 - classification_loss: 0.0743 326/500 [==================>...........] - ETA: 57s - loss: 0.6261 - regression_loss: 0.5516 - classification_loss: 0.0746 327/500 [==================>...........] - ETA: 57s - loss: 0.6259 - regression_loss: 0.5513 - classification_loss: 0.0746 328/500 [==================>...........] - ETA: 56s - loss: 0.6275 - regression_loss: 0.5527 - classification_loss: 0.0749 329/500 [==================>...........] - ETA: 56s - loss: 0.6273 - regression_loss: 0.5524 - classification_loss: 0.0749 330/500 [==================>...........] - ETA: 56s - loss: 0.6271 - regression_loss: 0.5523 - classification_loss: 0.0748 331/500 [==================>...........] - ETA: 55s - loss: 0.6273 - regression_loss: 0.5525 - classification_loss: 0.0748 332/500 [==================>...........] - ETA: 55s - loss: 0.6261 - regression_loss: 0.5514 - classification_loss: 0.0747 333/500 [==================>...........] - ETA: 55s - loss: 0.6246 - regression_loss: 0.5500 - classification_loss: 0.0745 334/500 [===================>..........] - ETA: 54s - loss: 0.6256 - regression_loss: 0.5509 - classification_loss: 0.0747 335/500 [===================>..........] - ETA: 54s - loss: 0.6258 - regression_loss: 0.5510 - classification_loss: 0.0747 336/500 [===================>..........] - ETA: 54s - loss: 0.6246 - regression_loss: 0.5500 - classification_loss: 0.0746 337/500 [===================>..........] - ETA: 53s - loss: 0.6250 - regression_loss: 0.5503 - classification_loss: 0.0747 338/500 [===================>..........] - ETA: 53s - loss: 0.6240 - regression_loss: 0.5495 - classification_loss: 0.0745 339/500 [===================>..........] - ETA: 53s - loss: 0.6233 - regression_loss: 0.5490 - classification_loss: 0.0744 340/500 [===================>..........] 
- ETA: 52s - loss: 0.6236 - regression_loss: 0.5492 - classification_loss: 0.0744 341/500 [===================>..........] - ETA: 52s - loss: 0.6248 - regression_loss: 0.5501 - classification_loss: 0.0747 342/500 [===================>..........] - ETA: 52s - loss: 0.6243 - regression_loss: 0.5497 - classification_loss: 0.0745 343/500 [===================>..........] - ETA: 51s - loss: 0.6239 - regression_loss: 0.5495 - classification_loss: 0.0745 344/500 [===================>..........] - ETA: 51s - loss: 0.6239 - regression_loss: 0.5493 - classification_loss: 0.0746 345/500 [===================>..........] - ETA: 51s - loss: 0.6238 - regression_loss: 0.5492 - classification_loss: 0.0746 346/500 [===================>..........] - ETA: 50s - loss: 0.6233 - regression_loss: 0.5489 - classification_loss: 0.0745 347/500 [===================>..........] - ETA: 50s - loss: 0.6230 - regression_loss: 0.5486 - classification_loss: 0.0745 348/500 [===================>..........] - ETA: 50s - loss: 0.6233 - regression_loss: 0.5489 - classification_loss: 0.0744 349/500 [===================>..........] - ETA: 49s - loss: 0.6233 - regression_loss: 0.5488 - classification_loss: 0.0745 350/500 [====================>.........] - ETA: 49s - loss: 0.6231 - regression_loss: 0.5487 - classification_loss: 0.0745 351/500 [====================>.........] - ETA: 49s - loss: 0.6229 - regression_loss: 0.5484 - classification_loss: 0.0745 352/500 [====================>.........] - ETA: 48s - loss: 0.6228 - regression_loss: 0.5484 - classification_loss: 0.0744 353/500 [====================>.........] - ETA: 48s - loss: 0.6217 - regression_loss: 0.5474 - classification_loss: 0.0743 354/500 [====================>.........] - ETA: 48s - loss: 0.6208 - regression_loss: 0.5466 - classification_loss: 0.0742 355/500 [====================>.........] - ETA: 47s - loss: 0.6205 - regression_loss: 0.5465 - classification_loss: 0.0741 356/500 [====================>.........] 
- ETA: 47s - loss: 0.6200 - regression_loss: 0.5460 - classification_loss: 0.0740 357/500 [====================>.........] - ETA: 47s - loss: 0.6195 - regression_loss: 0.5456 - classification_loss: 0.0739 358/500 [====================>.........] - ETA: 46s - loss: 0.6192 - regression_loss: 0.5454 - classification_loss: 0.0738 359/500 [====================>.........] - ETA: 46s - loss: 0.6204 - regression_loss: 0.5465 - classification_loss: 0.0739 360/500 [====================>.........] - ETA: 46s - loss: 0.6194 - regression_loss: 0.5456 - classification_loss: 0.0738 361/500 [====================>.........] - ETA: 46s - loss: 0.6192 - regression_loss: 0.5454 - classification_loss: 0.0738 362/500 [====================>.........] - ETA: 45s - loss: 0.6179 - regression_loss: 0.5442 - classification_loss: 0.0737 363/500 [====================>.........] - ETA: 45s - loss: 0.6172 - regression_loss: 0.5436 - classification_loss: 0.0736 364/500 [====================>.........] - ETA: 45s - loss: 0.6169 - regression_loss: 0.5433 - classification_loss: 0.0736 365/500 [====================>.........] - ETA: 44s - loss: 0.6176 - regression_loss: 0.5439 - classification_loss: 0.0738 366/500 [====================>.........] - ETA: 44s - loss: 0.6171 - regression_loss: 0.5434 - classification_loss: 0.0737 367/500 [=====================>........] - ETA: 43s - loss: 0.6175 - regression_loss: 0.5437 - classification_loss: 0.0738 368/500 [=====================>........] - ETA: 43s - loss: 0.6168 - regression_loss: 0.5431 - classification_loss: 0.0737 369/500 [=====================>........] - ETA: 43s - loss: 0.6168 - regression_loss: 0.5431 - classification_loss: 0.0737 370/500 [=====================>........] - ETA: 43s - loss: 0.6171 - regression_loss: 0.5434 - classification_loss: 0.0736 371/500 [=====================>........] - ETA: 42s - loss: 0.6166 - regression_loss: 0.5430 - classification_loss: 0.0735 372/500 [=====================>........] 
- ETA: 42s - loss: 0.6166 - regression_loss: 0.5429 - classification_loss: 0.0736 373/500 [=====================>........] - ETA: 42s - loss: 0.6171 - regression_loss: 0.5434 - classification_loss: 0.0737 374/500 [=====================>........] - ETA: 41s - loss: 0.6164 - regression_loss: 0.5428 - classification_loss: 0.0736 375/500 [=====================>........] - ETA: 41s - loss: 0.6161 - regression_loss: 0.5425 - classification_loss: 0.0735 376/500 [=====================>........] - ETA: 41s - loss: 0.6164 - regression_loss: 0.5428 - classification_loss: 0.0735 377/500 [=====================>........] - ETA: 40s - loss: 0.6163 - regression_loss: 0.5427 - classification_loss: 0.0736 378/500 [=====================>........] - ETA: 40s - loss: 0.6173 - regression_loss: 0.5435 - classification_loss: 0.0738 379/500 [=====================>........] - ETA: 40s - loss: 0.6168 - regression_loss: 0.5431 - classification_loss: 0.0738 380/500 [=====================>........] - ETA: 39s - loss: 0.6175 - regression_loss: 0.5437 - classification_loss: 0.0738 381/500 [=====================>........] - ETA: 39s - loss: 0.6179 - regression_loss: 0.5440 - classification_loss: 0.0739 382/500 [=====================>........] - ETA: 39s - loss: 0.6180 - regression_loss: 0.5441 - classification_loss: 0.0739 383/500 [=====================>........] - ETA: 38s - loss: 0.6187 - regression_loss: 0.5447 - classification_loss: 0.0740 384/500 [======================>.......] - ETA: 38s - loss: 0.6183 - regression_loss: 0.5444 - classification_loss: 0.0739 385/500 [======================>.......] - ETA: 38s - loss: 0.6175 - regression_loss: 0.5437 - classification_loss: 0.0738 386/500 [======================>.......] - ETA: 37s - loss: 0.6178 - regression_loss: 0.5439 - classification_loss: 0.0739 387/500 [======================>.......] - ETA: 37s - loss: 0.6175 - regression_loss: 0.5438 - classification_loss: 0.0738 388/500 [======================>.......] 
- ETA: 37s - loss: 0.6175 - regression_loss: 0.5438 - classification_loss: 0.0737 389/500 [======================>.......] - ETA: 36s - loss: 0.6168 - regression_loss: 0.5431 - classification_loss: 0.0736 390/500 [======================>.......] - ETA: 36s - loss: 0.6161 - regression_loss: 0.5425 - classification_loss: 0.0735 391/500 [======================>.......] - ETA: 36s - loss: 0.6162 - regression_loss: 0.5426 - classification_loss: 0.0736 392/500 [======================>.......] - ETA: 35s - loss: 0.6161 - regression_loss: 0.5425 - classification_loss: 0.0736 393/500 [======================>.......] - ETA: 35s - loss: 0.6162 - regression_loss: 0.5426 - classification_loss: 0.0736 394/500 [======================>.......] - ETA: 35s - loss: 0.6158 - regression_loss: 0.5423 - classification_loss: 0.0735 395/500 [======================>.......] - ETA: 34s - loss: 0.6164 - regression_loss: 0.5429 - classification_loss: 0.0735 396/500 [======================>.......] - ETA: 34s - loss: 0.6156 - regression_loss: 0.5421 - classification_loss: 0.0735 397/500 [======================>.......] - ETA: 34s - loss: 0.6166 - regression_loss: 0.5432 - classification_loss: 0.0734 398/500 [======================>.......] - ETA: 33s - loss: 0.6177 - regression_loss: 0.5441 - classification_loss: 0.0735 399/500 [======================>.......] - ETA: 33s - loss: 0.6174 - regression_loss: 0.5439 - classification_loss: 0.0735 400/500 [=======================>......] - ETA: 33s - loss: 0.6169 - regression_loss: 0.5435 - classification_loss: 0.0734 401/500 [=======================>......] - ETA: 32s - loss: 0.6167 - regression_loss: 0.5434 - classification_loss: 0.0733 402/500 [=======================>......] - ETA: 32s - loss: 0.6165 - regression_loss: 0.5432 - classification_loss: 0.0733 403/500 [=======================>......] - ETA: 32s - loss: 0.6162 - regression_loss: 0.5430 - classification_loss: 0.0732 404/500 [=======================>......] 
- ETA: 31s - loss: 0.6156 - regression_loss: 0.5425 - classification_loss: 0.0731 405/500 [=======================>......] - ETA: 31s - loss: 0.6152 - regression_loss: 0.5421 - classification_loss: 0.0731 406/500 [=======================>......] - ETA: 31s - loss: 0.6157 - regression_loss: 0.5426 - classification_loss: 0.0731 407/500 [=======================>......] - ETA: 30s - loss: 0.6147 - regression_loss: 0.5417 - classification_loss: 0.0730 408/500 [=======================>......] - ETA: 30s - loss: 0.6154 - regression_loss: 0.5423 - classification_loss: 0.0731 409/500 [=======================>......] - ETA: 30s - loss: 0.6154 - regression_loss: 0.5423 - classification_loss: 0.0731 410/500 [=======================>......] - ETA: 29s - loss: 0.6150 - regression_loss: 0.5419 - classification_loss: 0.0730 411/500 [=======================>......] - ETA: 29s - loss: 0.6157 - regression_loss: 0.5425 - classification_loss: 0.0731 412/500 [=======================>......] - ETA: 29s - loss: 0.6145 - regression_loss: 0.5415 - classification_loss: 0.0730 413/500 [=======================>......] - ETA: 28s - loss: 0.6150 - regression_loss: 0.5419 - classification_loss: 0.0731 414/500 [=======================>......] - ETA: 28s - loss: 0.6142 - regression_loss: 0.5412 - classification_loss: 0.0730 415/500 [=======================>......] - ETA: 28s - loss: 0.6135 - regression_loss: 0.5406 - classification_loss: 0.0729 416/500 [=======================>......] - ETA: 27s - loss: 0.6135 - regression_loss: 0.5406 - classification_loss: 0.0729 417/500 [========================>.....] - ETA: 27s - loss: 0.6139 - regression_loss: 0.5409 - classification_loss: 0.0729 418/500 [========================>.....] - ETA: 27s - loss: 0.6133 - regression_loss: 0.5404 - classification_loss: 0.0729 419/500 [========================>.....] - ETA: 26s - loss: 0.6127 - regression_loss: 0.5399 - classification_loss: 0.0728 420/500 [========================>.....] 
- ETA: 26s - loss: 0.6127 - regression_loss: 0.5399 - classification_loss: 0.0727 421/500 [========================>.....] - ETA: 26s - loss: 0.6130 - regression_loss: 0.5402 - classification_loss: 0.0728 422/500 [========================>.....] - ETA: 25s - loss: 0.6130 - regression_loss: 0.5403 - classification_loss: 0.0727 423/500 [========================>.....] - ETA: 25s - loss: 0.6123 - regression_loss: 0.5397 - classification_loss: 0.0726 424/500 [========================>.....] - ETA: 25s - loss: 0.6118 - regression_loss: 0.5392 - classification_loss: 0.0726 425/500 [========================>.....] - ETA: 24s - loss: 0.6112 - regression_loss: 0.5387 - classification_loss: 0.0725 426/500 [========================>.....] - ETA: 24s - loss: 0.6108 - regression_loss: 0.5383 - classification_loss: 0.0724 427/500 [========================>.....] - ETA: 24s - loss: 0.6108 - regression_loss: 0.5383 - classification_loss: 0.0725 428/500 [========================>.....] - ETA: 23s - loss: 0.6112 - regression_loss: 0.5386 - classification_loss: 0.0725 429/500 [========================>.....] - ETA: 23s - loss: 0.6118 - regression_loss: 0.5392 - classification_loss: 0.0727 430/500 [========================>.....] - ETA: 23s - loss: 0.6119 - regression_loss: 0.5392 - classification_loss: 0.0727 431/500 [========================>.....] - ETA: 22s - loss: 0.6125 - regression_loss: 0.5397 - classification_loss: 0.0729 432/500 [========================>.....] - ETA: 22s - loss: 0.6131 - regression_loss: 0.5401 - classification_loss: 0.0730 433/500 [========================>.....] - ETA: 22s - loss: 0.6131 - regression_loss: 0.5402 - classification_loss: 0.0730 434/500 [=========================>....] - ETA: 21s - loss: 0.6131 - regression_loss: 0.5401 - classification_loss: 0.0730 435/500 [=========================>....] - ETA: 21s - loss: 0.6125 - regression_loss: 0.5396 - classification_loss: 0.0729 436/500 [=========================>....] 
[... per-batch progress updates for epoch 51, batches 437-499, elided; running loss held steady around 0.609-0.613 (regression ~0.536-0.539, classification ~0.072) ...]
500/500 [==============================] - 165s 331ms/step - loss: 0.6090 - regression_loss: 0.5364 - classification_loss: 0.0726
1172 instances of class plum with average precision: 0.6470
mAP: 0.6470
Epoch 00051: saving model to ./training/snapshots/resnet101_pascal_51.h5
Epoch 52/150
[... per-batch progress updates for epoch 52, batches 1-269, elided; running loss started near 0.83 on the first batch and settled to 0.5931 (regression_loss: 0.5216, classification_loss: 0.0715) by batch 269/500, ETA 1:16 ...]
- ETA: 1:16 - loss: 0.5939 - regression_loss: 0.5224 - classification_loss: 0.0715 271/500 [===============>..............] - ETA: 1:15 - loss: 0.5926 - regression_loss: 0.5212 - classification_loss: 0.0714 272/500 [===============>..............] - ETA: 1:15 - loss: 0.5916 - regression_loss: 0.5203 - classification_loss: 0.0712 273/500 [===============>..............] - ETA: 1:15 - loss: 0.5919 - regression_loss: 0.5206 - classification_loss: 0.0713 274/500 [===============>..............] - ETA: 1:14 - loss: 0.5928 - regression_loss: 0.5214 - classification_loss: 0.0713 275/500 [===============>..............] - ETA: 1:14 - loss: 0.5933 - regression_loss: 0.5219 - classification_loss: 0.0714 276/500 [===============>..............] - ETA: 1:14 - loss: 0.5937 - regression_loss: 0.5223 - classification_loss: 0.0714 277/500 [===============>..............] - ETA: 1:13 - loss: 0.5932 - regression_loss: 0.5219 - classification_loss: 0.0713 278/500 [===============>..............] - ETA: 1:13 - loss: 0.5923 - regression_loss: 0.5212 - classification_loss: 0.0711 279/500 [===============>..............] - ETA: 1:13 - loss: 0.5924 - regression_loss: 0.5213 - classification_loss: 0.0711 280/500 [===============>..............] - ETA: 1:12 - loss: 0.5944 - regression_loss: 0.5230 - classification_loss: 0.0714 281/500 [===============>..............] - ETA: 1:12 - loss: 0.5935 - regression_loss: 0.5222 - classification_loss: 0.0714 282/500 [===============>..............] - ETA: 1:12 - loss: 0.5929 - regression_loss: 0.5216 - classification_loss: 0.0713 283/500 [===============>..............] - ETA: 1:11 - loss: 0.5935 - regression_loss: 0.5222 - classification_loss: 0.0713 284/500 [================>.............] - ETA: 1:11 - loss: 0.5925 - regression_loss: 0.5214 - classification_loss: 0.0711 285/500 [================>.............] - ETA: 1:11 - loss: 0.5929 - regression_loss: 0.5217 - classification_loss: 0.0712 286/500 [================>.............] 
- ETA: 1:10 - loss: 0.5939 - regression_loss: 0.5226 - classification_loss: 0.0712 287/500 [================>.............] - ETA: 1:10 - loss: 0.5926 - regression_loss: 0.5216 - classification_loss: 0.0711 288/500 [================>.............] - ETA: 1:10 - loss: 0.5929 - regression_loss: 0.5217 - classification_loss: 0.0712 289/500 [================>.............] - ETA: 1:09 - loss: 0.5926 - regression_loss: 0.5215 - classification_loss: 0.0711 290/500 [================>.............] - ETA: 1:09 - loss: 0.5924 - regression_loss: 0.5214 - classification_loss: 0.0710 291/500 [================>.............] - ETA: 1:09 - loss: 0.5926 - regression_loss: 0.5215 - classification_loss: 0.0710 292/500 [================>.............] - ETA: 1:08 - loss: 0.5933 - regression_loss: 0.5222 - classification_loss: 0.0711 293/500 [================>.............] - ETA: 1:08 - loss: 0.5933 - regression_loss: 0.5222 - classification_loss: 0.0711 294/500 [================>.............] - ETA: 1:08 - loss: 0.5943 - regression_loss: 0.5230 - classification_loss: 0.0712 295/500 [================>.............] - ETA: 1:07 - loss: 0.5942 - regression_loss: 0.5229 - classification_loss: 0.0713 296/500 [================>.............] - ETA: 1:07 - loss: 0.5932 - regression_loss: 0.5221 - classification_loss: 0.0711 297/500 [================>.............] - ETA: 1:07 - loss: 0.5939 - regression_loss: 0.5227 - classification_loss: 0.0712 298/500 [================>.............] - ETA: 1:06 - loss: 0.5950 - regression_loss: 0.5235 - classification_loss: 0.0715 299/500 [================>.............] - ETA: 1:06 - loss: 0.5947 - regression_loss: 0.5233 - classification_loss: 0.0714 300/500 [=================>............] - ETA: 1:06 - loss: 0.5964 - regression_loss: 0.5248 - classification_loss: 0.0716 301/500 [=================>............] - ETA: 1:05 - loss: 0.5959 - regression_loss: 0.5243 - classification_loss: 0.0716 302/500 [=================>............] 
- ETA: 1:05 - loss: 0.5962 - regression_loss: 0.5245 - classification_loss: 0.0717 303/500 [=================>............] - ETA: 1:05 - loss: 0.5952 - regression_loss: 0.5235 - classification_loss: 0.0716 304/500 [=================>............] - ETA: 1:04 - loss: 0.5971 - regression_loss: 0.5251 - classification_loss: 0.0719 305/500 [=================>............] - ETA: 1:04 - loss: 0.5973 - regression_loss: 0.5253 - classification_loss: 0.0720 306/500 [=================>............] - ETA: 1:04 - loss: 0.5974 - regression_loss: 0.5254 - classification_loss: 0.0720 307/500 [=================>............] - ETA: 1:03 - loss: 0.5977 - regression_loss: 0.5256 - classification_loss: 0.0721 308/500 [=================>............] - ETA: 1:03 - loss: 0.5970 - regression_loss: 0.5251 - classification_loss: 0.0720 309/500 [=================>............] - ETA: 1:03 - loss: 0.5974 - regression_loss: 0.5254 - classification_loss: 0.0720 310/500 [=================>............] - ETA: 1:02 - loss: 0.5966 - regression_loss: 0.5245 - classification_loss: 0.0721 311/500 [=================>............] - ETA: 1:02 - loss: 0.5988 - regression_loss: 0.5264 - classification_loss: 0.0724 312/500 [=================>............] - ETA: 1:02 - loss: 0.5986 - regression_loss: 0.5262 - classification_loss: 0.0723 313/500 [=================>............] - ETA: 1:01 - loss: 0.5995 - regression_loss: 0.5269 - classification_loss: 0.0726 314/500 [=================>............] - ETA: 1:01 - loss: 0.6003 - regression_loss: 0.5275 - classification_loss: 0.0727 315/500 [=================>............] - ETA: 1:01 - loss: 0.5991 - regression_loss: 0.5265 - classification_loss: 0.0726 316/500 [=================>............] - ETA: 1:00 - loss: 0.5997 - regression_loss: 0.5270 - classification_loss: 0.0726 317/500 [==================>...........] - ETA: 1:00 - loss: 0.6006 - regression_loss: 0.5279 - classification_loss: 0.0727 318/500 [==================>...........] 
- ETA: 1:00 - loss: 0.6004 - regression_loss: 0.5277 - classification_loss: 0.0727 319/500 [==================>...........] - ETA: 59s - loss: 0.6008 - regression_loss: 0.5279 - classification_loss: 0.0729  320/500 [==================>...........] - ETA: 59s - loss: 0.6005 - regression_loss: 0.5276 - classification_loss: 0.0729 321/500 [==================>...........] - ETA: 59s - loss: 0.6000 - regression_loss: 0.5272 - classification_loss: 0.0728 322/500 [==================>...........] - ETA: 58s - loss: 0.6001 - regression_loss: 0.5274 - classification_loss: 0.0727 323/500 [==================>...........] - ETA: 58s - loss: 0.6011 - regression_loss: 0.5282 - classification_loss: 0.0729 324/500 [==================>...........] - ETA: 58s - loss: 0.6001 - regression_loss: 0.5274 - classification_loss: 0.0727 325/500 [==================>...........] - ETA: 57s - loss: 0.5995 - regression_loss: 0.5268 - classification_loss: 0.0726 326/500 [==================>...........] - ETA: 57s - loss: 0.5980 - regression_loss: 0.5255 - classification_loss: 0.0725 327/500 [==================>...........] - ETA: 57s - loss: 0.5983 - regression_loss: 0.5258 - classification_loss: 0.0725 328/500 [==================>...........] - ETA: 56s - loss: 0.5987 - regression_loss: 0.5262 - classification_loss: 0.0725 329/500 [==================>...........] - ETA: 56s - loss: 0.5979 - regression_loss: 0.5254 - classification_loss: 0.0724 330/500 [==================>...........] - ETA: 56s - loss: 0.5977 - regression_loss: 0.5254 - classification_loss: 0.0724 331/500 [==================>...........] - ETA: 55s - loss: 0.5968 - regression_loss: 0.5246 - classification_loss: 0.0722 332/500 [==================>...........] - ETA: 55s - loss: 0.5974 - regression_loss: 0.5251 - classification_loss: 0.0723 333/500 [==================>...........] - ETA: 55s - loss: 0.5965 - regression_loss: 0.5244 - classification_loss: 0.0721 334/500 [===================>..........] 
- ETA: 54s - loss: 0.5965 - regression_loss: 0.5244 - classification_loss: 0.0721 335/500 [===================>..........] - ETA: 54s - loss: 0.5974 - regression_loss: 0.5251 - classification_loss: 0.0722 336/500 [===================>..........] - ETA: 54s - loss: 0.5971 - regression_loss: 0.5248 - classification_loss: 0.0722 337/500 [===================>..........] - ETA: 53s - loss: 0.5966 - regression_loss: 0.5244 - classification_loss: 0.0722 338/500 [===================>..........] - ETA: 53s - loss: 0.5957 - regression_loss: 0.5237 - classification_loss: 0.0721 339/500 [===================>..........] - ETA: 53s - loss: 0.5953 - regression_loss: 0.5232 - classification_loss: 0.0721 340/500 [===================>..........] - ETA: 52s - loss: 0.5947 - regression_loss: 0.5227 - classification_loss: 0.0720 341/500 [===================>..........] - ETA: 52s - loss: 0.5941 - regression_loss: 0.5222 - classification_loss: 0.0719 342/500 [===================>..........] - ETA: 52s - loss: 0.5942 - regression_loss: 0.5223 - classification_loss: 0.0719 343/500 [===================>..........] - ETA: 51s - loss: 0.5945 - regression_loss: 0.5224 - classification_loss: 0.0720 344/500 [===================>..........] - ETA: 51s - loss: 0.5957 - regression_loss: 0.5234 - classification_loss: 0.0723 345/500 [===================>..........] - ETA: 51s - loss: 0.5969 - regression_loss: 0.5244 - classification_loss: 0.0725 346/500 [===================>..........] - ETA: 50s - loss: 0.5968 - regression_loss: 0.5244 - classification_loss: 0.0723 347/500 [===================>..........] - ETA: 50s - loss: 0.5961 - regression_loss: 0.5238 - classification_loss: 0.0723 348/500 [===================>..........] - ETA: 50s - loss: 0.5971 - regression_loss: 0.5247 - classification_loss: 0.0725 349/500 [===================>..........] - ETA: 49s - loss: 0.5971 - regression_loss: 0.5245 - classification_loss: 0.0726 350/500 [====================>.........] 
- ETA: 49s - loss: 0.5978 - regression_loss: 0.5251 - classification_loss: 0.0727 351/500 [====================>.........] - ETA: 49s - loss: 0.5991 - regression_loss: 0.5261 - classification_loss: 0.0729 352/500 [====================>.........] - ETA: 48s - loss: 0.5994 - regression_loss: 0.5264 - classification_loss: 0.0730 353/500 [====================>.........] - ETA: 48s - loss: 0.5985 - regression_loss: 0.5257 - classification_loss: 0.0729 354/500 [====================>.........] - ETA: 48s - loss: 0.5983 - regression_loss: 0.5255 - classification_loss: 0.0728 355/500 [====================>.........] - ETA: 47s - loss: 0.5982 - regression_loss: 0.5254 - classification_loss: 0.0728 356/500 [====================>.........] - ETA: 47s - loss: 0.5975 - regression_loss: 0.5248 - classification_loss: 0.0727 357/500 [====================>.........] - ETA: 47s - loss: 0.5967 - regression_loss: 0.5241 - classification_loss: 0.0726 358/500 [====================>.........] - ETA: 46s - loss: 0.5972 - regression_loss: 0.5246 - classification_loss: 0.0726 359/500 [====================>.........] - ETA: 46s - loss: 0.5971 - regression_loss: 0.5244 - classification_loss: 0.0726 360/500 [====================>.........] - ETA: 46s - loss: 0.5975 - regression_loss: 0.5247 - classification_loss: 0.0728 361/500 [====================>.........] - ETA: 45s - loss: 0.5963 - regression_loss: 0.5237 - classification_loss: 0.0726 362/500 [====================>.........] - ETA: 45s - loss: 0.5964 - regression_loss: 0.5238 - classification_loss: 0.0726 363/500 [====================>.........] - ETA: 45s - loss: 0.5961 - regression_loss: 0.5235 - classification_loss: 0.0726 364/500 [====================>.........] - ETA: 44s - loss: 0.5962 - regression_loss: 0.5237 - classification_loss: 0.0726 365/500 [====================>.........] - ETA: 44s - loss: 0.5964 - regression_loss: 0.5239 - classification_loss: 0.0725 366/500 [====================>.........] 
- ETA: 44s - loss: 0.5958 - regression_loss: 0.5235 - classification_loss: 0.0724 367/500 [=====================>........] - ETA: 43s - loss: 0.5959 - regression_loss: 0.5236 - classification_loss: 0.0724 368/500 [=====================>........] - ETA: 43s - loss: 0.5961 - regression_loss: 0.5238 - classification_loss: 0.0724 369/500 [=====================>........] - ETA: 43s - loss: 0.5951 - regression_loss: 0.5229 - classification_loss: 0.0722 370/500 [=====================>........] - ETA: 42s - loss: 0.5941 - regression_loss: 0.5220 - classification_loss: 0.0721 371/500 [=====================>........] - ETA: 42s - loss: 0.5951 - regression_loss: 0.5228 - classification_loss: 0.0722 372/500 [=====================>........] - ETA: 42s - loss: 0.5942 - regression_loss: 0.5221 - classification_loss: 0.0721 373/500 [=====================>........] - ETA: 41s - loss: 0.5942 - regression_loss: 0.5221 - classification_loss: 0.0721 374/500 [=====================>........] - ETA: 41s - loss: 0.5936 - regression_loss: 0.5216 - classification_loss: 0.0720 375/500 [=====================>........] - ETA: 41s - loss: 0.5945 - regression_loss: 0.5224 - classification_loss: 0.0721 376/500 [=====================>........] - ETA: 40s - loss: 0.5946 - regression_loss: 0.5226 - classification_loss: 0.0720 377/500 [=====================>........] - ETA: 40s - loss: 0.5944 - regression_loss: 0.5224 - classification_loss: 0.0720 378/500 [=====================>........] - ETA: 40s - loss: 0.5946 - regression_loss: 0.5227 - classification_loss: 0.0720 379/500 [=====================>........] - ETA: 39s - loss: 0.5946 - regression_loss: 0.5227 - classification_loss: 0.0719 380/500 [=====================>........] - ETA: 39s - loss: 0.5950 - regression_loss: 0.5231 - classification_loss: 0.0719 381/500 [=====================>........] - ETA: 39s - loss: 0.5954 - regression_loss: 0.5235 - classification_loss: 0.0719 382/500 [=====================>........] 
- ETA: 39s - loss: 0.5944 - regression_loss: 0.5227 - classification_loss: 0.0717 383/500 [=====================>........] - ETA: 38s - loss: 0.5953 - regression_loss: 0.5234 - classification_loss: 0.0719 384/500 [======================>.......] - ETA: 38s - loss: 0.5949 - regression_loss: 0.5230 - classification_loss: 0.0719 385/500 [======================>.......] - ETA: 38s - loss: 0.5953 - regression_loss: 0.5234 - classification_loss: 0.0719 386/500 [======================>.......] - ETA: 37s - loss: 0.5951 - regression_loss: 0.5233 - classification_loss: 0.0718 387/500 [======================>.......] - ETA: 37s - loss: 0.5953 - regression_loss: 0.5236 - classification_loss: 0.0717 388/500 [======================>.......] - ETA: 37s - loss: 0.5949 - regression_loss: 0.5232 - classification_loss: 0.0717 389/500 [======================>.......] - ETA: 36s - loss: 0.5960 - regression_loss: 0.5244 - classification_loss: 0.0716 390/500 [======================>.......] - ETA: 36s - loss: 0.5962 - regression_loss: 0.5247 - classification_loss: 0.0715 391/500 [======================>.......] - ETA: 36s - loss: 0.5958 - regression_loss: 0.5243 - classification_loss: 0.0715 392/500 [======================>.......] - ETA: 35s - loss: 0.5989 - regression_loss: 0.5271 - classification_loss: 0.0718 393/500 [======================>.......] - ETA: 35s - loss: 0.6008 - regression_loss: 0.5287 - classification_loss: 0.0721 394/500 [======================>.......] - ETA: 35s - loss: 0.6018 - regression_loss: 0.5296 - classification_loss: 0.0722 395/500 [======================>.......] - ETA: 34s - loss: 0.6018 - regression_loss: 0.5296 - classification_loss: 0.0722 396/500 [======================>.......] - ETA: 34s - loss: 0.6009 - regression_loss: 0.5288 - classification_loss: 0.0721 397/500 [======================>.......] - ETA: 34s - loss: 0.6002 - regression_loss: 0.5282 - classification_loss: 0.0720 398/500 [======================>.......] 
- ETA: 33s - loss: 0.5995 - regression_loss: 0.5277 - classification_loss: 0.0719 399/500 [======================>.......] - ETA: 33s - loss: 0.5997 - regression_loss: 0.5279 - classification_loss: 0.0718 400/500 [=======================>......] - ETA: 33s - loss: 0.6004 - regression_loss: 0.5286 - classification_loss: 0.0718 401/500 [=======================>......] - ETA: 32s - loss: 0.6013 - regression_loss: 0.5294 - classification_loss: 0.0720 402/500 [=======================>......] - ETA: 32s - loss: 0.6006 - regression_loss: 0.5287 - classification_loss: 0.0719 403/500 [=======================>......] - ETA: 32s - loss: 0.6008 - regression_loss: 0.5289 - classification_loss: 0.0719 404/500 [=======================>......] - ETA: 31s - loss: 0.5999 - regression_loss: 0.5282 - classification_loss: 0.0718 405/500 [=======================>......] - ETA: 31s - loss: 0.6000 - regression_loss: 0.5283 - classification_loss: 0.0717 406/500 [=======================>......] - ETA: 31s - loss: 0.6006 - regression_loss: 0.5287 - classification_loss: 0.0719 407/500 [=======================>......] - ETA: 30s - loss: 0.6014 - regression_loss: 0.5293 - classification_loss: 0.0721 408/500 [=======================>......] - ETA: 30s - loss: 0.6007 - regression_loss: 0.5287 - classification_loss: 0.0720 409/500 [=======================>......] - ETA: 30s - loss: 0.6012 - regression_loss: 0.5292 - classification_loss: 0.0721 410/500 [=======================>......] - ETA: 29s - loss: 0.6007 - regression_loss: 0.5287 - classification_loss: 0.0720 411/500 [=======================>......] - ETA: 29s - loss: 0.5999 - regression_loss: 0.5280 - classification_loss: 0.0719 412/500 [=======================>......] - ETA: 29s - loss: 0.6002 - regression_loss: 0.5282 - classification_loss: 0.0720 413/500 [=======================>......] - ETA: 28s - loss: 0.6003 - regression_loss: 0.5283 - classification_loss: 0.0720 414/500 [=======================>......] 
- ETA: 28s - loss: 0.6010 - regression_loss: 0.5288 - classification_loss: 0.0722 415/500 [=======================>......] - ETA: 28s - loss: 0.6010 - regression_loss: 0.5288 - classification_loss: 0.0722 416/500 [=======================>......] - ETA: 27s - loss: 0.6007 - regression_loss: 0.5285 - classification_loss: 0.0722 417/500 [========================>.....] - ETA: 27s - loss: 0.6001 - regression_loss: 0.5280 - classification_loss: 0.0721 418/500 [========================>.....] - ETA: 27s - loss: 0.5994 - regression_loss: 0.5275 - classification_loss: 0.0719 419/500 [========================>.....] - ETA: 26s - loss: 0.5994 - regression_loss: 0.5274 - classification_loss: 0.0719 420/500 [========================>.....] - ETA: 26s - loss: 0.5990 - regression_loss: 0.5271 - classification_loss: 0.0719 421/500 [========================>.....] - ETA: 26s - loss: 0.5989 - regression_loss: 0.5271 - classification_loss: 0.0718 422/500 [========================>.....] - ETA: 25s - loss: 0.5984 - regression_loss: 0.5267 - classification_loss: 0.0717 423/500 [========================>.....] - ETA: 25s - loss: 0.5983 - regression_loss: 0.5266 - classification_loss: 0.0718 424/500 [========================>.....] - ETA: 25s - loss: 0.5995 - regression_loss: 0.5275 - classification_loss: 0.0719 425/500 [========================>.....] - ETA: 24s - loss: 0.5997 - regression_loss: 0.5278 - classification_loss: 0.0719 426/500 [========================>.....] - ETA: 24s - loss: 0.6001 - regression_loss: 0.5281 - classification_loss: 0.0720 427/500 [========================>.....] - ETA: 24s - loss: 0.6005 - regression_loss: 0.5285 - classification_loss: 0.0719 428/500 [========================>.....] - ETA: 23s - loss: 0.6000 - regression_loss: 0.5281 - classification_loss: 0.0719 429/500 [========================>.....] - ETA: 23s - loss: 0.5997 - regression_loss: 0.5278 - classification_loss: 0.0718 430/500 [========================>.....] 
- ETA: 23s - loss: 0.6000 - regression_loss: 0.5281 - classification_loss: 0.0718 431/500 [========================>.....] - ETA: 22s - loss: 0.5995 - regression_loss: 0.5277 - classification_loss: 0.0717 432/500 [========================>.....] - ETA: 22s - loss: 0.5999 - regression_loss: 0.5282 - classification_loss: 0.0718 433/500 [========================>.....] - ETA: 22s - loss: 0.5993 - regression_loss: 0.5276 - classification_loss: 0.0717 434/500 [=========================>....] - ETA: 21s - loss: 0.5988 - regression_loss: 0.5272 - classification_loss: 0.0716 435/500 [=========================>....] - ETA: 21s - loss: 0.5986 - regression_loss: 0.5270 - classification_loss: 0.0715 436/500 [=========================>....] - ETA: 21s - loss: 0.5984 - regression_loss: 0.5269 - classification_loss: 0.0715 437/500 [=========================>....] - ETA: 20s - loss: 0.5987 - regression_loss: 0.5272 - classification_loss: 0.0715 438/500 [=========================>....] - ETA: 20s - loss: 0.5989 - regression_loss: 0.5274 - classification_loss: 0.0715 439/500 [=========================>....] - ETA: 20s - loss: 0.5981 - regression_loss: 0.5267 - classification_loss: 0.0714 440/500 [=========================>....] - ETA: 19s - loss: 0.5977 - regression_loss: 0.5263 - classification_loss: 0.0713 441/500 [=========================>....] - ETA: 19s - loss: 0.5979 - regression_loss: 0.5265 - classification_loss: 0.0714 442/500 [=========================>....] - ETA: 19s - loss: 0.5970 - regression_loss: 0.5257 - classification_loss: 0.0713 443/500 [=========================>....] - ETA: 18s - loss: 0.5967 - regression_loss: 0.5255 - classification_loss: 0.0713 444/500 [=========================>....] - ETA: 18s - loss: 0.5967 - regression_loss: 0.5254 - classification_loss: 0.0713 445/500 [=========================>....] - ETA: 18s - loss: 0.5978 - regression_loss: 0.5263 - classification_loss: 0.0715 446/500 [=========================>....] 
- ETA: 17s - loss: 0.5981 - regression_loss: 0.5265 - classification_loss: 0.0716 447/500 [=========================>....] - ETA: 17s - loss: 0.5983 - regression_loss: 0.5267 - classification_loss: 0.0716 448/500 [=========================>....] - ETA: 17s - loss: 0.5991 - regression_loss: 0.5274 - classification_loss: 0.0717 449/500 [=========================>....] - ETA: 16s - loss: 0.5990 - regression_loss: 0.5273 - classification_loss: 0.0717 450/500 [==========================>...] - ETA: 16s - loss: 0.5984 - regression_loss: 0.5268 - classification_loss: 0.0716 451/500 [==========================>...] - ETA: 16s - loss: 0.5981 - regression_loss: 0.5266 - classification_loss: 0.0715 452/500 [==========================>...] - ETA: 15s - loss: 0.5975 - regression_loss: 0.5261 - classification_loss: 0.0714 453/500 [==========================>...] - ETA: 15s - loss: 0.5980 - regression_loss: 0.5265 - classification_loss: 0.0715 454/500 [==========================>...] - ETA: 15s - loss: 0.5981 - regression_loss: 0.5267 - classification_loss: 0.0715 455/500 [==========================>...] - ETA: 14s - loss: 0.5977 - regression_loss: 0.5263 - classification_loss: 0.0714 456/500 [==========================>...] - ETA: 14s - loss: 0.5986 - regression_loss: 0.5271 - classification_loss: 0.0715 457/500 [==========================>...] - ETA: 14s - loss: 0.5986 - regression_loss: 0.5271 - classification_loss: 0.0715 458/500 [==========================>...] - ETA: 13s - loss: 0.5988 - regression_loss: 0.5271 - classification_loss: 0.0716 459/500 [==========================>...] - ETA: 13s - loss: 0.5986 - regression_loss: 0.5270 - classification_loss: 0.0716 460/500 [==========================>...] - ETA: 13s - loss: 0.5985 - regression_loss: 0.5268 - classification_loss: 0.0716 461/500 [==========================>...] - ETA: 12s - loss: 0.5983 - regression_loss: 0.5266 - classification_loss: 0.0717 462/500 [==========================>...] 
- ETA: 12s - loss: 0.5980 - regression_loss: 0.5263 - classification_loss: 0.0716 463/500 [==========================>...] - ETA: 12s - loss: 0.5979 - regression_loss: 0.5263 - classification_loss: 0.0716 464/500 [==========================>...] - ETA: 11s - loss: 0.5985 - regression_loss: 0.5268 - classification_loss: 0.0717 465/500 [==========================>...] - ETA: 11s - loss: 0.5992 - regression_loss: 0.5273 - classification_loss: 0.0719 466/500 [==========================>...] - ETA: 11s - loss: 0.5992 - regression_loss: 0.5273 - classification_loss: 0.0719 467/500 [===========================>..] - ETA: 10s - loss: 0.5986 - regression_loss: 0.5268 - classification_loss: 0.0718 468/500 [===========================>..] - ETA: 10s - loss: 0.5995 - regression_loss: 0.5277 - classification_loss: 0.0718 469/500 [===========================>..] - ETA: 10s - loss: 0.5987 - regression_loss: 0.5270 - classification_loss: 0.0717 470/500 [===========================>..] - ETA: 9s - loss: 0.5991 - regression_loss: 0.5273 - classification_loss: 0.0718  471/500 [===========================>..] - ETA: 9s - loss: 0.5984 - regression_loss: 0.5267 - classification_loss: 0.0717 472/500 [===========================>..] - ETA: 9s - loss: 0.5987 - regression_loss: 0.5270 - classification_loss: 0.0717 473/500 [===========================>..] - ETA: 8s - loss: 0.5981 - regression_loss: 0.5264 - classification_loss: 0.0716 474/500 [===========================>..] - ETA: 8s - loss: 0.5977 - regression_loss: 0.5261 - classification_loss: 0.0716 475/500 [===========================>..] - ETA: 8s - loss: 0.5973 - regression_loss: 0.5258 - classification_loss: 0.0715 476/500 [===========================>..] - ETA: 7s - loss: 0.5979 - regression_loss: 0.5264 - classification_loss: 0.0715 477/500 [===========================>..] - ETA: 7s - loss: 0.5972 - regression_loss: 0.5258 - classification_loss: 0.0714 478/500 [===========================>..] 
- ETA: 7s - loss: 0.5973 - regression_loss: 0.5259 - classification_loss: 0.0714
500/500 [==============================] - 165s 331ms/step - loss: 0.6021 - regression_loss: 0.5298 - classification_loss: 0.0722
1172 instances of class plum with average precision: 0.6230
mAP: 0.6230
Epoch 00052: saving model to ./training/snapshots/resnet101_pascal_52.h5
Epoch 53/150
312/500 [=================>............] - ETA: 1:02 - loss: 0.5959 - regression_loss: 0.5251 - classification_loss: 0.0708
313/500 [=================>............]
- ETA: 1:02 - loss: 0.5958 - regression_loss: 0.5251 - classification_loss: 0.0707 314/500 [=================>............] - ETA: 1:01 - loss: 0.5958 - regression_loss: 0.5252 - classification_loss: 0.0706 315/500 [=================>............] - ETA: 1:01 - loss: 0.5965 - regression_loss: 0.5258 - classification_loss: 0.0707 316/500 [=================>............] - ETA: 1:01 - loss: 0.5979 - regression_loss: 0.5269 - classification_loss: 0.0710 317/500 [==================>...........] - ETA: 1:00 - loss: 0.5984 - regression_loss: 0.5274 - classification_loss: 0.0710 318/500 [==================>...........] - ETA: 1:00 - loss: 0.5979 - regression_loss: 0.5270 - classification_loss: 0.0709 319/500 [==================>...........] - ETA: 1:00 - loss: 0.5982 - regression_loss: 0.5273 - classification_loss: 0.0709 320/500 [==================>...........] - ETA: 59s - loss: 0.5984 - regression_loss: 0.5276 - classification_loss: 0.0708  321/500 [==================>...........] - ETA: 59s - loss: 0.5986 - regression_loss: 0.5277 - classification_loss: 0.0709 322/500 [==================>...........] - ETA: 59s - loss: 0.5979 - regression_loss: 0.5270 - classification_loss: 0.0708 323/500 [==================>...........] - ETA: 58s - loss: 0.5972 - regression_loss: 0.5265 - classification_loss: 0.0707 324/500 [==================>...........] - ETA: 58s - loss: 0.5974 - regression_loss: 0.5267 - classification_loss: 0.0707 325/500 [==================>...........] - ETA: 58s - loss: 0.5986 - regression_loss: 0.5277 - classification_loss: 0.0709 326/500 [==================>...........] - ETA: 57s - loss: 0.5982 - regression_loss: 0.5274 - classification_loss: 0.0709 327/500 [==================>...........] - ETA: 57s - loss: 0.5982 - regression_loss: 0.5275 - classification_loss: 0.0708 328/500 [==================>...........] - ETA: 57s - loss: 0.5993 - regression_loss: 0.5282 - classification_loss: 0.0711 329/500 [==================>...........] 
- ETA: 56s - loss: 0.5997 - regression_loss: 0.5286 - classification_loss: 0.0711 330/500 [==================>...........] - ETA: 56s - loss: 0.5992 - regression_loss: 0.5282 - classification_loss: 0.0710 331/500 [==================>...........] - ETA: 56s - loss: 0.5995 - regression_loss: 0.5284 - classification_loss: 0.0711 332/500 [==================>...........] - ETA: 55s - loss: 0.6002 - regression_loss: 0.5288 - classification_loss: 0.0713 333/500 [==================>...........] - ETA: 55s - loss: 0.5987 - regression_loss: 0.5275 - classification_loss: 0.0712 334/500 [===================>..........] - ETA: 55s - loss: 0.5996 - regression_loss: 0.5283 - classification_loss: 0.0712 335/500 [===================>..........] - ETA: 54s - loss: 0.5994 - regression_loss: 0.5281 - classification_loss: 0.0712 336/500 [===================>..........] - ETA: 54s - loss: 0.5989 - regression_loss: 0.5277 - classification_loss: 0.0712 337/500 [===================>..........] - ETA: 54s - loss: 0.5983 - regression_loss: 0.5272 - classification_loss: 0.0711 338/500 [===================>..........] - ETA: 53s - loss: 0.5988 - regression_loss: 0.5276 - classification_loss: 0.0713 339/500 [===================>..........] - ETA: 53s - loss: 0.5986 - regression_loss: 0.5274 - classification_loss: 0.0712 340/500 [===================>..........] - ETA: 53s - loss: 0.5982 - regression_loss: 0.5269 - classification_loss: 0.0712 341/500 [===================>..........] - ETA: 52s - loss: 0.5981 - regression_loss: 0.5270 - classification_loss: 0.0711 342/500 [===================>..........] - ETA: 52s - loss: 0.5978 - regression_loss: 0.5267 - classification_loss: 0.0711 343/500 [===================>..........] - ETA: 52s - loss: 0.5974 - regression_loss: 0.5264 - classification_loss: 0.0710 344/500 [===================>..........] - ETA: 51s - loss: 0.5974 - regression_loss: 0.5263 - classification_loss: 0.0710 345/500 [===================>..........] 
- ETA: 51s - loss: 0.5975 - regression_loss: 0.5264 - classification_loss: 0.0710 346/500 [===================>..........] - ETA: 51s - loss: 0.5971 - regression_loss: 0.5260 - classification_loss: 0.0711 347/500 [===================>..........] - ETA: 50s - loss: 0.5961 - regression_loss: 0.5252 - classification_loss: 0.0709 348/500 [===================>..........] - ETA: 50s - loss: 0.5949 - regression_loss: 0.5241 - classification_loss: 0.0708 349/500 [===================>..........] - ETA: 50s - loss: 0.5945 - regression_loss: 0.5238 - classification_loss: 0.0707 350/500 [====================>.........] - ETA: 49s - loss: 0.5950 - regression_loss: 0.5242 - classification_loss: 0.0708 351/500 [====================>.........] - ETA: 49s - loss: 0.5952 - regression_loss: 0.5244 - classification_loss: 0.0708 352/500 [====================>.........] - ETA: 49s - loss: 0.5950 - regression_loss: 0.5242 - classification_loss: 0.0708 353/500 [====================>.........] - ETA: 48s - loss: 0.5943 - regression_loss: 0.5236 - classification_loss: 0.0707 354/500 [====================>.........] - ETA: 48s - loss: 0.5945 - regression_loss: 0.5238 - classification_loss: 0.0707 355/500 [====================>.........] - ETA: 48s - loss: 0.5939 - regression_loss: 0.5234 - classification_loss: 0.0706 356/500 [====================>.........] - ETA: 47s - loss: 0.5937 - regression_loss: 0.5232 - classification_loss: 0.0705 357/500 [====================>.........] - ETA: 47s - loss: 0.5943 - regression_loss: 0.5237 - classification_loss: 0.0706 358/500 [====================>.........] - ETA: 47s - loss: 0.5947 - regression_loss: 0.5240 - classification_loss: 0.0707 359/500 [====================>.........] - ETA: 46s - loss: 0.5941 - regression_loss: 0.5235 - classification_loss: 0.0706 360/500 [====================>.........] - ETA: 46s - loss: 0.5948 - regression_loss: 0.5243 - classification_loss: 0.0705 361/500 [====================>.........] 
- ETA: 46s - loss: 0.5949 - regression_loss: 0.5244 - classification_loss: 0.0705 362/500 [====================>.........] - ETA: 45s - loss: 0.5950 - regression_loss: 0.5245 - classification_loss: 0.0705 363/500 [====================>.........] - ETA: 45s - loss: 0.5939 - regression_loss: 0.5235 - classification_loss: 0.0704 364/500 [====================>.........] - ETA: 45s - loss: 0.5929 - regression_loss: 0.5226 - classification_loss: 0.0703 365/500 [====================>.........] - ETA: 44s - loss: 0.5923 - regression_loss: 0.5221 - classification_loss: 0.0702 366/500 [====================>.........] - ETA: 44s - loss: 0.5928 - regression_loss: 0.5225 - classification_loss: 0.0703 367/500 [=====================>........] - ETA: 44s - loss: 0.5926 - regression_loss: 0.5224 - classification_loss: 0.0702 368/500 [=====================>........] - ETA: 43s - loss: 0.5942 - regression_loss: 0.5236 - classification_loss: 0.0706 369/500 [=====================>........] - ETA: 43s - loss: 0.5947 - regression_loss: 0.5241 - classification_loss: 0.0706 370/500 [=====================>........] - ETA: 43s - loss: 0.5963 - regression_loss: 0.5255 - classification_loss: 0.0709 371/500 [=====================>........] - ETA: 42s - loss: 0.5964 - regression_loss: 0.5254 - classification_loss: 0.0710 372/500 [=====================>........] - ETA: 42s - loss: 0.5965 - regression_loss: 0.5255 - classification_loss: 0.0710 373/500 [=====================>........] - ETA: 42s - loss: 0.5957 - regression_loss: 0.5248 - classification_loss: 0.0708 374/500 [=====================>........] - ETA: 41s - loss: 0.5950 - regression_loss: 0.5243 - classification_loss: 0.0707 375/500 [=====================>........] - ETA: 41s - loss: 0.5942 - regression_loss: 0.5236 - classification_loss: 0.0706 376/500 [=====================>........] - ETA: 41s - loss: 0.5932 - regression_loss: 0.5227 - classification_loss: 0.0705 377/500 [=====================>........] 
- ETA: 40s - loss: 0.5944 - regression_loss: 0.5237 - classification_loss: 0.0707 378/500 [=====================>........] - ETA: 40s - loss: 0.5946 - regression_loss: 0.5239 - classification_loss: 0.0707 379/500 [=====================>........] - ETA: 40s - loss: 0.5942 - regression_loss: 0.5236 - classification_loss: 0.0707 380/500 [=====================>........] - ETA: 39s - loss: 0.5934 - regression_loss: 0.5229 - classification_loss: 0.0706 381/500 [=====================>........] - ETA: 39s - loss: 0.5939 - regression_loss: 0.5233 - classification_loss: 0.0706 382/500 [=====================>........] - ETA: 39s - loss: 0.5942 - regression_loss: 0.5236 - classification_loss: 0.0706 383/500 [=====================>........] - ETA: 38s - loss: 0.5936 - regression_loss: 0.5231 - classification_loss: 0.0705 384/500 [======================>.......] - ETA: 38s - loss: 0.5938 - regression_loss: 0.5232 - classification_loss: 0.0707 385/500 [======================>.......] - ETA: 38s - loss: 0.5935 - regression_loss: 0.5229 - classification_loss: 0.0706 386/500 [======================>.......] - ETA: 37s - loss: 0.5945 - regression_loss: 0.5238 - classification_loss: 0.0707 387/500 [======================>.......] - ETA: 37s - loss: 0.5938 - regression_loss: 0.5232 - classification_loss: 0.0706 388/500 [======================>.......] - ETA: 37s - loss: 0.5941 - regression_loss: 0.5234 - classification_loss: 0.0707 389/500 [======================>.......] - ETA: 36s - loss: 0.5932 - regression_loss: 0.5226 - classification_loss: 0.0705 390/500 [======================>.......] - ETA: 36s - loss: 0.5935 - regression_loss: 0.5229 - classification_loss: 0.0705 391/500 [======================>.......] - ETA: 36s - loss: 0.5939 - regression_loss: 0.5233 - classification_loss: 0.0706 392/500 [======================>.......] - ETA: 35s - loss: 0.5942 - regression_loss: 0.5236 - classification_loss: 0.0706 393/500 [======================>.......] 
- ETA: 35s - loss: 0.5941 - regression_loss: 0.5235 - classification_loss: 0.0707 394/500 [======================>.......] - ETA: 35s - loss: 0.5935 - regression_loss: 0.5229 - classification_loss: 0.0706 395/500 [======================>.......] - ETA: 34s - loss: 0.5927 - regression_loss: 0.5223 - classification_loss: 0.0704 396/500 [======================>.......] - ETA: 34s - loss: 0.5922 - regression_loss: 0.5218 - classification_loss: 0.0703 397/500 [======================>.......] - ETA: 34s - loss: 0.5913 - regression_loss: 0.5211 - classification_loss: 0.0702 398/500 [======================>.......] - ETA: 33s - loss: 0.5926 - regression_loss: 0.5222 - classification_loss: 0.0704 399/500 [======================>.......] - ETA: 33s - loss: 0.5924 - regression_loss: 0.5220 - classification_loss: 0.0704 400/500 [=======================>......] - ETA: 33s - loss: 0.5924 - regression_loss: 0.5221 - classification_loss: 0.0704 401/500 [=======================>......] - ETA: 32s - loss: 0.5920 - regression_loss: 0.5217 - classification_loss: 0.0703 402/500 [=======================>......] - ETA: 32s - loss: 0.5919 - regression_loss: 0.5216 - classification_loss: 0.0703 403/500 [=======================>......] - ETA: 32s - loss: 0.5925 - regression_loss: 0.5221 - classification_loss: 0.0704 404/500 [=======================>......] - ETA: 31s - loss: 0.5921 - regression_loss: 0.5217 - classification_loss: 0.0704 405/500 [=======================>......] - ETA: 31s - loss: 0.5924 - regression_loss: 0.5220 - classification_loss: 0.0704 406/500 [=======================>......] - ETA: 31s - loss: 0.5914 - regression_loss: 0.5211 - classification_loss: 0.0703 407/500 [=======================>......] - ETA: 30s - loss: 0.5927 - regression_loss: 0.5222 - classification_loss: 0.0705 408/500 [=======================>......] - ETA: 30s - loss: 0.5928 - regression_loss: 0.5223 - classification_loss: 0.0705 409/500 [=======================>......] 
- ETA: 30s - loss: 0.5928 - regression_loss: 0.5223 - classification_loss: 0.0705 410/500 [=======================>......] - ETA: 29s - loss: 0.5930 - regression_loss: 0.5225 - classification_loss: 0.0705 411/500 [=======================>......] - ETA: 29s - loss: 0.5921 - regression_loss: 0.5217 - classification_loss: 0.0704 412/500 [=======================>......] - ETA: 29s - loss: 0.5927 - regression_loss: 0.5221 - classification_loss: 0.0705 413/500 [=======================>......] - ETA: 28s - loss: 0.5926 - regression_loss: 0.5220 - classification_loss: 0.0706 414/500 [=======================>......] - ETA: 28s - loss: 0.5924 - regression_loss: 0.5218 - classification_loss: 0.0705 415/500 [=======================>......] - ETA: 28s - loss: 0.5916 - regression_loss: 0.5212 - classification_loss: 0.0704 416/500 [=======================>......] - ETA: 27s - loss: 0.5911 - regression_loss: 0.5208 - classification_loss: 0.0703 417/500 [========================>.....] - ETA: 27s - loss: 0.5906 - regression_loss: 0.5203 - classification_loss: 0.0703 418/500 [========================>.....] - ETA: 27s - loss: 0.5899 - regression_loss: 0.5197 - classification_loss: 0.0702 419/500 [========================>.....] - ETA: 26s - loss: 0.5901 - regression_loss: 0.5199 - classification_loss: 0.0702 420/500 [========================>.....] - ETA: 26s - loss: 0.5896 - regression_loss: 0.5195 - classification_loss: 0.0701 421/500 [========================>.....] - ETA: 26s - loss: 0.5895 - regression_loss: 0.5194 - classification_loss: 0.0701 422/500 [========================>.....] - ETA: 25s - loss: 0.5896 - regression_loss: 0.5195 - classification_loss: 0.0701 423/500 [========================>.....] - ETA: 25s - loss: 0.5897 - regression_loss: 0.5196 - classification_loss: 0.0700 424/500 [========================>.....] - ETA: 25s - loss: 0.5890 - regression_loss: 0.5191 - classification_loss: 0.0699 425/500 [========================>.....] 
- ETA: 24s - loss: 0.5884 - regression_loss: 0.5185 - classification_loss: 0.0699 426/500 [========================>.....] - ETA: 24s - loss: 0.5879 - regression_loss: 0.5182 - classification_loss: 0.0698 427/500 [========================>.....] - ETA: 24s - loss: 0.5874 - regression_loss: 0.5178 - classification_loss: 0.0697 428/500 [========================>.....] - ETA: 23s - loss: 0.5869 - regression_loss: 0.5173 - classification_loss: 0.0696 429/500 [========================>.....] - ETA: 23s - loss: 0.5869 - regression_loss: 0.5174 - classification_loss: 0.0695 430/500 [========================>.....] - ETA: 23s - loss: 0.5868 - regression_loss: 0.5173 - classification_loss: 0.0695 431/500 [========================>.....] - ETA: 22s - loss: 0.5864 - regression_loss: 0.5170 - classification_loss: 0.0694 432/500 [========================>.....] - ETA: 22s - loss: 0.5873 - regression_loss: 0.5178 - classification_loss: 0.0695 433/500 [========================>.....] - ETA: 22s - loss: 0.5866 - regression_loss: 0.5172 - classification_loss: 0.0694 434/500 [=========================>....] - ETA: 21s - loss: 0.5864 - regression_loss: 0.5170 - classification_loss: 0.0694 435/500 [=========================>....] - ETA: 21s - loss: 0.5863 - regression_loss: 0.5170 - classification_loss: 0.0693 436/500 [=========================>....] - ETA: 21s - loss: 0.5868 - regression_loss: 0.5175 - classification_loss: 0.0694 437/500 [=========================>....] - ETA: 20s - loss: 0.5862 - regression_loss: 0.5170 - classification_loss: 0.0693 438/500 [=========================>....] - ETA: 20s - loss: 0.5866 - regression_loss: 0.5174 - classification_loss: 0.0693 439/500 [=========================>....] - ETA: 20s - loss: 0.5867 - regression_loss: 0.5175 - classification_loss: 0.0693 440/500 [=========================>....] - ETA: 19s - loss: 0.5862 - regression_loss: 0.5170 - classification_loss: 0.0692 441/500 [=========================>....] 
- ETA: 19s - loss: 0.5865 - regression_loss: 0.5172 - classification_loss: 0.0693 442/500 [=========================>....] - ETA: 19s - loss: 0.5859 - regression_loss: 0.5167 - classification_loss: 0.0692 443/500 [=========================>....] - ETA: 18s - loss: 0.5851 - regression_loss: 0.5160 - classification_loss: 0.0691 444/500 [=========================>....] - ETA: 18s - loss: 0.5845 - regression_loss: 0.5155 - classification_loss: 0.0690 445/500 [=========================>....] - ETA: 18s - loss: 0.5836 - regression_loss: 0.5146 - classification_loss: 0.0689 446/500 [=========================>....] - ETA: 17s - loss: 0.5838 - regression_loss: 0.5149 - classification_loss: 0.0689 447/500 [=========================>....] - ETA: 17s - loss: 0.5835 - regression_loss: 0.5146 - classification_loss: 0.0689 448/500 [=========================>....] - ETA: 17s - loss: 0.5840 - regression_loss: 0.5150 - classification_loss: 0.0689 449/500 [=========================>....] - ETA: 16s - loss: 0.5840 - regression_loss: 0.5151 - classification_loss: 0.0689 450/500 [==========================>...] - ETA: 16s - loss: 0.5837 - regression_loss: 0.5149 - classification_loss: 0.0688 451/500 [==========================>...] - ETA: 16s - loss: 0.5834 - regression_loss: 0.5146 - classification_loss: 0.0688 452/500 [==========================>...] - ETA: 15s - loss: 0.5833 - regression_loss: 0.5145 - classification_loss: 0.0688 453/500 [==========================>...] - ETA: 15s - loss: 0.5839 - regression_loss: 0.5151 - classification_loss: 0.0688 454/500 [==========================>...] - ETA: 15s - loss: 0.5846 - regression_loss: 0.5157 - classification_loss: 0.0689 455/500 [==========================>...] - ETA: 14s - loss: 0.5842 - regression_loss: 0.5153 - classification_loss: 0.0689 456/500 [==========================>...] - ETA: 14s - loss: 0.5846 - regression_loss: 0.5157 - classification_loss: 0.0689 457/500 [==========================>...] 
- ETA: 14s - loss: 0.5848 - regression_loss: 0.5158 - classification_loss: 0.0689 458/500 [==========================>...] - ETA: 13s - loss: 0.5852 - regression_loss: 0.5162 - classification_loss: 0.0690 459/500 [==========================>...] - ETA: 13s - loss: 0.5849 - regression_loss: 0.5159 - classification_loss: 0.0690 460/500 [==========================>...] - ETA: 13s - loss: 0.5852 - regression_loss: 0.5163 - classification_loss: 0.0689 461/500 [==========================>...] - ETA: 12s - loss: 0.5858 - regression_loss: 0.5168 - classification_loss: 0.0690 462/500 [==========================>...] - ETA: 12s - loss: 0.5852 - regression_loss: 0.5163 - classification_loss: 0.0689 463/500 [==========================>...] - ETA: 12s - loss: 0.5851 - regression_loss: 0.5162 - classification_loss: 0.0689 464/500 [==========================>...] - ETA: 11s - loss: 0.5849 - regression_loss: 0.5161 - classification_loss: 0.0688 465/500 [==========================>...] - ETA: 11s - loss: 0.5847 - regression_loss: 0.5159 - classification_loss: 0.0688 466/500 [==========================>...] - ETA: 11s - loss: 0.5848 - regression_loss: 0.5160 - classification_loss: 0.0688 467/500 [===========================>..] - ETA: 10s - loss: 0.5855 - regression_loss: 0.5166 - classification_loss: 0.0689 468/500 [===========================>..] - ETA: 10s - loss: 0.5859 - regression_loss: 0.5169 - classification_loss: 0.0690 469/500 [===========================>..] - ETA: 10s - loss: 0.5861 - regression_loss: 0.5170 - classification_loss: 0.0690 470/500 [===========================>..] - ETA: 9s - loss: 0.5857 - regression_loss: 0.5168 - classification_loss: 0.0689  471/500 [===========================>..] - ETA: 9s - loss: 0.5855 - regression_loss: 0.5166 - classification_loss: 0.0689 472/500 [===========================>..] - ETA: 9s - loss: 0.5854 - regression_loss: 0.5165 - classification_loss: 0.0689 473/500 [===========================>..] 
- ETA: 8s - loss: 0.5861 - regression_loss: 0.5170 - classification_loss: 0.0690 474/500 [===========================>..] - ETA: 8s - loss: 0.5867 - regression_loss: 0.5176 - classification_loss: 0.0692 475/500 [===========================>..] - ETA: 8s - loss: 0.5862 - regression_loss: 0.5171 - classification_loss: 0.0691 476/500 [===========================>..] - ETA: 7s - loss: 0.5867 - regression_loss: 0.5175 - classification_loss: 0.0692 477/500 [===========================>..] - ETA: 7s - loss: 0.5862 - regression_loss: 0.5171 - classification_loss: 0.0691 478/500 [===========================>..] - ETA: 7s - loss: 0.5867 - regression_loss: 0.5176 - classification_loss: 0.0692 479/500 [===========================>..] - ETA: 6s - loss: 0.5867 - regression_loss: 0.5176 - classification_loss: 0.0692 480/500 [===========================>..] - ETA: 6s - loss: 0.5865 - regression_loss: 0.5173 - classification_loss: 0.0691 481/500 [===========================>..] - ETA: 6s - loss: 0.5857 - regression_loss: 0.5166 - classification_loss: 0.0690 482/500 [===========================>..] - ETA: 5s - loss: 0.5853 - regression_loss: 0.5163 - classification_loss: 0.0690 483/500 [===========================>..] - ETA: 5s - loss: 0.5860 - regression_loss: 0.5170 - classification_loss: 0.0689 484/500 [============================>.] - ETA: 5s - loss: 0.5867 - regression_loss: 0.5176 - classification_loss: 0.0691 485/500 [============================>.] - ETA: 4s - loss: 0.5859 - regression_loss: 0.5169 - classification_loss: 0.0690 486/500 [============================>.] - ETA: 4s - loss: 0.5855 - regression_loss: 0.5165 - classification_loss: 0.0690 487/500 [============================>.] - ETA: 4s - loss: 0.5856 - regression_loss: 0.5166 - classification_loss: 0.0690 488/500 [============================>.] - ETA: 3s - loss: 0.5860 - regression_loss: 0.5170 - classification_loss: 0.0690 489/500 [============================>.] 
- ETA: 3s - loss: 0.5856 - regression_loss: 0.5166 - classification_loss: 0.0690 490/500 [============================>.] - ETA: 3s - loss: 0.5857 - regression_loss: 0.5167 - classification_loss: 0.0690 491/500 [============================>.] - ETA: 2s - loss: 0.5860 - regression_loss: 0.5170 - classification_loss: 0.0690 492/500 [============================>.] - ETA: 2s - loss: 0.5869 - regression_loss: 0.5177 - classification_loss: 0.0692 493/500 [============================>.] - ETA: 2s - loss: 0.5869 - regression_loss: 0.5177 - classification_loss: 0.0692 494/500 [============================>.] - ETA: 1s - loss: 0.5870 - regression_loss: 0.5178 - classification_loss: 0.0692 495/500 [============================>.] - ETA: 1s - loss: 0.5876 - regression_loss: 0.5184 - classification_loss: 0.0692 496/500 [============================>.] - ETA: 1s - loss: 0.5871 - regression_loss: 0.5180 - classification_loss: 0.0691 497/500 [============================>.] - ETA: 0s - loss: 0.5868 - regression_loss: 0.5178 - classification_loss: 0.0690 498/500 [============================>.] - ETA: 0s - loss: 0.5870 - regression_loss: 0.5180 - classification_loss: 0.0690 499/500 [============================>.] - ETA: 0s - loss: 0.5871 - regression_loss: 0.5181 - classification_loss: 0.0690 500/500 [==============================] - 166s 332ms/step - loss: 0.5864 - regression_loss: 0.5176 - classification_loss: 0.0689 1172 instances of class plum with average precision: 0.6515 mAP: 0.6515 Epoch 00053: saving model to ./training/snapshots/resnet101_pascal_53.h5 Epoch 54/150 1/500 [..............................] - ETA: 2:29 - loss: 1.0615 - regression_loss: 0.9148 - classification_loss: 0.1467 2/500 [..............................] - ETA: 2:32 - loss: 0.7099 - regression_loss: 0.6214 - classification_loss: 0.0885 3/500 [..............................] - ETA: 2:33 - loss: 0.7707 - regression_loss: 0.6625 - classification_loss: 0.1082 4/500 [..............................] 
- ETA: 2:33 - loss: 0.7405 - regression_loss: 0.6423 - classification_loss: 0.0982 5/500 [..............................] - ETA: 2:33 - loss: 0.7064 - regression_loss: 0.6156 - classification_loss: 0.0908 6/500 [..............................] - ETA: 2:36 - loss: 0.7579 - regression_loss: 0.6550 - classification_loss: 0.1029 7/500 [..............................] - ETA: 2:36 - loss: 0.7409 - regression_loss: 0.6413 - classification_loss: 0.0996 8/500 [..............................] - ETA: 2:38 - loss: 0.7100 - regression_loss: 0.6158 - classification_loss: 0.0942 9/500 [..............................] - ETA: 2:38 - loss: 0.7060 - regression_loss: 0.6129 - classification_loss: 0.0930 10/500 [..............................] - ETA: 2:37 - loss: 0.6720 - regression_loss: 0.5856 - classification_loss: 0.0864 11/500 [..............................] - ETA: 2:37 - loss: 0.6598 - regression_loss: 0.5771 - classification_loss: 0.0827 12/500 [..............................] - ETA: 2:37 - loss: 0.7145 - regression_loss: 0.6238 - classification_loss: 0.0906 13/500 [..............................] - ETA: 2:38 - loss: 0.7544 - regression_loss: 0.6549 - classification_loss: 0.0994 14/500 [..............................] - ETA: 2:37 - loss: 0.7352 - regression_loss: 0.6402 - classification_loss: 0.0950 15/500 [..............................] - ETA: 2:37 - loss: 0.7262 - regression_loss: 0.6328 - classification_loss: 0.0934 16/500 [..............................] - ETA: 2:37 - loss: 0.7381 - regression_loss: 0.6430 - classification_loss: 0.0951 17/500 [>.............................] - ETA: 2:36 - loss: 0.7186 - regression_loss: 0.6261 - classification_loss: 0.0925 18/500 [>.............................] - ETA: 2:36 - loss: 0.7221 - regression_loss: 0.6290 - classification_loss: 0.0930 19/500 [>.............................] - ETA: 2:36 - loss: 0.7038 - regression_loss: 0.6138 - classification_loss: 0.0901 20/500 [>.............................] 
- ETA: 2:35 - loss: 0.7213 - regression_loss: 0.6279 - classification_loss: 0.0934
 21/500 [>.............................] - ETA: 2:35 - loss: 0.7065 - regression_loss: 0.6159 - classification_loss: 0.0906
 22/500 [>.............................] - ETA: 2:34 - loss: 0.7087 - regression_loss: 0.6195 - classification_loss: 0.0891
 23/500 [>.............................] - ETA: 2:34 - loss: 0.7051 - regression_loss: 0.6167 - classification_loss: 0.0884
 24/500 [>.............................] - ETA: 2:34 - loss: 0.6931 - regression_loss: 0.6063 - classification_loss: 0.0867
 25/500 [>.............................] - ETA: 2:33 - loss: 0.6955 - regression_loss: 0.6086 - classification_loss: 0.0869
 26/500 [>.............................] - ETA: 2:33 - loss: 0.6885 - regression_loss: 0.6018 - classification_loss: 0.0867
 27/500 [>.............................] - ETA: 2:33 - loss: 0.6980 - regression_loss: 0.6103 - classification_loss: 0.0877
 28/500 [>.............................] - ETA: 2:33 - loss: 0.6874 - regression_loss: 0.6014 - classification_loss: 0.0860
 29/500 [>.............................] - ETA: 2:32 - loss: 0.6837 - regression_loss: 0.5992 - classification_loss: 0.0846
 30/500 [>.............................] - ETA: 2:32 - loss: 0.6912 - regression_loss: 0.6058 - classification_loss: 0.0854
 31/500 [>.............................] - ETA: 2:32 - loss: 0.6847 - regression_loss: 0.6010 - classification_loss: 0.0836
 32/500 [>.............................] - ETA: 2:32 - loss: 0.6918 - regression_loss: 0.6074 - classification_loss: 0.0844
 33/500 [>.............................] - ETA: 2:32 - loss: 0.7073 - regression_loss: 0.6198 - classification_loss: 0.0875
 34/500 [=>............................] - ETA: 2:32 - loss: 0.7024 - regression_loss: 0.6164 - classification_loss: 0.0860
 35/500 [=>............................] - ETA: 2:31 - loss: 0.6933 - regression_loss: 0.6081 - classification_loss: 0.0852
 36/500 [=>............................] - ETA: 2:31 - loss: 0.6951 - regression_loss: 0.6102 - classification_loss: 0.0849
 37/500 [=>............................] - ETA: 2:31 - loss: 0.6815 - regression_loss: 0.5981 - classification_loss: 0.0835
 38/500 [=>............................] - ETA: 2:31 - loss: 0.6670 - regression_loss: 0.5850 - classification_loss: 0.0820
 39/500 [=>............................] - ETA: 2:30 - loss: 0.6685 - regression_loss: 0.5867 - classification_loss: 0.0818
 40/500 [=>............................] - ETA: 2:30 - loss: 0.6621 - regression_loss: 0.5809 - classification_loss: 0.0812
 41/500 [=>............................] - ETA: 2:30 - loss: 0.6645 - regression_loss: 0.5826 - classification_loss: 0.0819
 42/500 [=>............................] - ETA: 2:29 - loss: 0.6667 - regression_loss: 0.5851 - classification_loss: 0.0816
 43/500 [=>............................] - ETA: 2:29 - loss: 0.6776 - regression_loss: 0.5940 - classification_loss: 0.0836
 44/500 [=>............................] - ETA: 2:28 - loss: 0.6761 - regression_loss: 0.5926 - classification_loss: 0.0834
 45/500 [=>............................] - ETA: 2:28 - loss: 0.6668 - regression_loss: 0.5843 - classification_loss: 0.0824
 46/500 [=>............................] - ETA: 2:28 - loss: 0.6719 - regression_loss: 0.5873 - classification_loss: 0.0845
 47/500 [=>............................] - ETA: 2:27 - loss: 0.6683 - regression_loss: 0.5843 - classification_loss: 0.0840
 48/500 [=>............................] - ETA: 2:27 - loss: 0.6659 - regression_loss: 0.5821 - classification_loss: 0.0838
 49/500 [=>............................] - ETA: 2:27 - loss: 0.6739 - regression_loss: 0.5889 - classification_loss: 0.0849
 50/500 [==>...........................] - ETA: 2:26 - loss: 0.6713 - regression_loss: 0.5866 - classification_loss: 0.0846
 51/500 [==>...........................] - ETA: 2:26 - loss: 0.6709 - regression_loss: 0.5852 - classification_loss: 0.0857
 52/500 [==>...........................] - ETA: 2:26 - loss: 0.6673 - regression_loss: 0.5830 - classification_loss: 0.0843
 53/500 [==>...........................] - ETA: 2:25 - loss: 0.6598 - regression_loss: 0.5762 - classification_loss: 0.0836
 54/500 [==>...........................] - ETA: 2:25 - loss: 0.6590 - regression_loss: 0.5761 - classification_loss: 0.0830
 55/500 [==>...........................] - ETA: 2:25 - loss: 0.6583 - regression_loss: 0.5751 - classification_loss: 0.0832
 56/500 [==>...........................] - ETA: 2:25 - loss: 0.6624 - regression_loss: 0.5787 - classification_loss: 0.0837
 57/500 [==>...........................] - ETA: 2:24 - loss: 0.6574 - regression_loss: 0.5743 - classification_loss: 0.0831
 58/500 [==>...........................] - ETA: 2:24 - loss: 0.6591 - regression_loss: 0.5760 - classification_loss: 0.0832
 59/500 [==>...........................] - ETA: 2:24 - loss: 0.6605 - regression_loss: 0.5773 - classification_loss: 0.0832
 60/500 [==>...........................] - ETA: 2:23 - loss: 0.6548 - regression_loss: 0.5722 - classification_loss: 0.0826
 61/500 [==>...........................] - ETA: 2:23 - loss: 0.6623 - regression_loss: 0.5766 - classification_loss: 0.0857
 62/500 [==>...........................] - ETA: 2:23 - loss: 0.6655 - regression_loss: 0.5801 - classification_loss: 0.0855
 63/500 [==>...........................] - ETA: 2:22 - loss: 0.6632 - regression_loss: 0.5782 - classification_loss: 0.0850
 64/500 [==>...........................] - ETA: 2:22 - loss: 0.6742 - regression_loss: 0.5882 - classification_loss: 0.0860
 65/500 [==>...........................] - ETA: 2:22 - loss: 0.6703 - regression_loss: 0.5850 - classification_loss: 0.0853
 66/500 [==>...........................] - ETA: 2:22 - loss: 0.6640 - regression_loss: 0.5797 - classification_loss: 0.0843
 67/500 [===>..........................] - ETA: 2:21 - loss: 0.6600 - regression_loss: 0.5766 - classification_loss: 0.0834
 68/500 [===>..........................] - ETA: 2:21 - loss: 0.6590 - regression_loss: 0.5762 - classification_loss: 0.0828
 69/500 [===>..........................] - ETA: 2:21 - loss: 0.6588 - regression_loss: 0.5761 - classification_loss: 0.0827
 70/500 [===>..........................] - ETA: 2:20 - loss: 0.6682 - regression_loss: 0.5846 - classification_loss: 0.0836
 71/500 [===>..........................] - ETA: 2:20 - loss: 0.6616 - regression_loss: 0.5789 - classification_loss: 0.0826
 72/500 [===>..........................] - ETA: 2:20 - loss: 0.6715 - regression_loss: 0.5871 - classification_loss: 0.0844
 73/500 [===>..........................] - ETA: 2:19 - loss: 0.6771 - regression_loss: 0.5922 - classification_loss: 0.0849
 74/500 [===>..........................] - ETA: 2:19 - loss: 0.6784 - regression_loss: 0.5937 - classification_loss: 0.0847
 75/500 [===>..........................] - ETA: 2:19 - loss: 0.6813 - regression_loss: 0.5966 - classification_loss: 0.0847
 76/500 [===>..........................] - ETA: 2:19 - loss: 0.6851 - regression_loss: 0.5996 - classification_loss: 0.0855
 77/500 [===>..........................] - ETA: 2:18 - loss: 0.6862 - regression_loss: 0.6007 - classification_loss: 0.0856
 78/500 [===>..........................] - ETA: 2:18 - loss: 0.6863 - regression_loss: 0.6005 - classification_loss: 0.0858
 79/500 [===>..........................] - ETA: 2:18 - loss: 0.6871 - regression_loss: 0.6013 - classification_loss: 0.0858
 80/500 [===>..........................] - ETA: 2:17 - loss: 0.6903 - regression_loss: 0.6034 - classification_loss: 0.0869
 81/500 [===>..........................] - ETA: 2:17 - loss: 0.6838 - regression_loss: 0.5974 - classification_loss: 0.0864
 82/500 [===>..........................] - ETA: 2:17 - loss: 0.6847 - regression_loss: 0.5982 - classification_loss: 0.0864
 83/500 [===>..........................] - ETA: 2:16 - loss: 0.6895 - regression_loss: 0.6025 - classification_loss: 0.0871
 84/500 [====>.........................] - ETA: 2:16 - loss: 0.6928 - regression_loss: 0.6046 - classification_loss: 0.0881
 85/500 [====>.........................] - ETA: 2:16 - loss: 0.6994 - regression_loss: 0.6102 - classification_loss: 0.0892
 86/500 [====>.........................] - ETA: 2:15 - loss: 0.6970 - regression_loss: 0.6084 - classification_loss: 0.0886
 87/500 [====>.........................] - ETA: 2:15 - loss: 0.6947 - regression_loss: 0.6063 - classification_loss: 0.0883
 88/500 [====>.........................] - ETA: 2:15 - loss: 0.6899 - regression_loss: 0.6022 - classification_loss: 0.0877
 89/500 [====>.........................] - ETA: 2:14 - loss: 0.6892 - regression_loss: 0.6017 - classification_loss: 0.0875
 90/500 [====>.........................] - ETA: 2:14 - loss: 0.6876 - regression_loss: 0.6002 - classification_loss: 0.0874
 91/500 [====>.........................] - ETA: 2:14 - loss: 0.6829 - regression_loss: 0.5963 - classification_loss: 0.0866
 92/500 [====>.........................] - ETA: 2:13 - loss: 0.6821 - regression_loss: 0.5955 - classification_loss: 0.0866
 93/500 [====>.........................] - ETA: 2:13 - loss: 0.6807 - regression_loss: 0.5943 - classification_loss: 0.0864
 94/500 [====>.........................] - ETA: 2:13 - loss: 0.6791 - regression_loss: 0.5927 - classification_loss: 0.0864
 95/500 [====>.........................] - ETA: 2:13 - loss: 0.6830 - regression_loss: 0.5960 - classification_loss: 0.0870
 96/500 [====>.........................] - ETA: 2:12 - loss: 0.6814 - regression_loss: 0.5948 - classification_loss: 0.0866
 97/500 [====>.........................] - ETA: 2:12 - loss: 0.6871 - regression_loss: 0.5991 - classification_loss: 0.0880
 98/500 [====>.........................] - ETA: 2:12 - loss: 0.6876 - regression_loss: 0.5998 - classification_loss: 0.0877
 99/500 [====>.........................] - ETA: 2:11 - loss: 0.6868 - regression_loss: 0.5993 - classification_loss: 0.0876
100/500 [=====>........................] - ETA: 2:11 - loss: 0.6873 - regression_loss: 0.5997 - classification_loss: 0.0876
101/500 [=====>........................] - ETA: 2:11 - loss: 0.6882 - regression_loss: 0.6006 - classification_loss: 0.0876
102/500 [=====>........................] - ETA: 2:10 - loss: 0.6867 - regression_loss: 0.5992 - classification_loss: 0.0874
103/500 [=====>........................] - ETA: 2:10 - loss: 0.6849 - regression_loss: 0.5977 - classification_loss: 0.0872
104/500 [=====>........................] - ETA: 2:09 - loss: 0.6833 - regression_loss: 0.5963 - classification_loss: 0.0870
105/500 [=====>........................] - ETA: 2:09 - loss: 0.6826 - regression_loss: 0.5957 - classification_loss: 0.0869
106/500 [=====>........................] - ETA: 2:09 - loss: 0.6838 - regression_loss: 0.5966 - classification_loss: 0.0872
107/500 [=====>........................] - ETA: 2:09 - loss: 0.6810 - regression_loss: 0.5942 - classification_loss: 0.0867
108/500 [=====>........................] - ETA: 2:08 - loss: 0.6803 - regression_loss: 0.5935 - classification_loss: 0.0868
109/500 [=====>........................] - ETA: 2:08 - loss: 0.6775 - regression_loss: 0.5912 - classification_loss: 0.0863
110/500 [=====>........................] - ETA: 2:08 - loss: 0.6766 - regression_loss: 0.5908 - classification_loss: 0.0858
111/500 [=====>........................] - ETA: 2:07 - loss: 0.6778 - regression_loss: 0.5920 - classification_loss: 0.0858
112/500 [=====>........................] - ETA: 2:07 - loss: 0.6780 - regression_loss: 0.5922 - classification_loss: 0.0858
113/500 [=====>........................] - ETA: 2:07 - loss: 0.6750 - regression_loss: 0.5896 - classification_loss: 0.0853
114/500 [=====>........................] - ETA: 2:06 - loss: 0.6732 - regression_loss: 0.5882 - classification_loss: 0.0849
115/500 [=====>........................] - ETA: 2:06 - loss: 0.6718 - regression_loss: 0.5872 - classification_loss: 0.0846
116/500 [=====>........................] - ETA: 2:06 - loss: 0.6738 - regression_loss: 0.5888 - classification_loss: 0.0850
117/500 [======>.......................] - ETA: 2:05 - loss: 0.6738 - regression_loss: 0.5886 - classification_loss: 0.0852
118/500 [======>.......................] - ETA: 2:05 - loss: 0.6717 - regression_loss: 0.5869 - classification_loss: 0.0848
119/500 [======>.......................] - ETA: 2:05 - loss: 0.6706 - regression_loss: 0.5858 - classification_loss: 0.0848
120/500 [======>.......................] - ETA: 2:04 - loss: 0.6680 - regression_loss: 0.5836 - classification_loss: 0.0844
121/500 [======>.......................] - ETA: 2:04 - loss: 0.6690 - regression_loss: 0.5846 - classification_loss: 0.0844
122/500 [======>.......................] - ETA: 2:04 - loss: 0.6664 - regression_loss: 0.5824 - classification_loss: 0.0840
123/500 [======>.......................] - ETA: 2:03 - loss: 0.6644 - regression_loss: 0.5809 - classification_loss: 0.0836
124/500 [======>.......................] - ETA: 2:03 - loss: 0.6630 - regression_loss: 0.5798 - classification_loss: 0.0832
125/500 [======>.......................] - ETA: 2:03 - loss: 0.6639 - regression_loss: 0.5805 - classification_loss: 0.0834
126/500 [======>.......................] - ETA: 2:02 - loss: 0.6626 - regression_loss: 0.5795 - classification_loss: 0.0831
127/500 [======>.......................] - ETA: 2:02 - loss: 0.6594 - regression_loss: 0.5768 - classification_loss: 0.0827
128/500 [======>.......................] - ETA: 2:02 - loss: 0.6585 - regression_loss: 0.5757 - classification_loss: 0.0827
129/500 [======>.......................] - ETA: 2:01 - loss: 0.6579 - regression_loss: 0.5755 - classification_loss: 0.0824
130/500 [======>.......................] - ETA: 2:01 - loss: 0.6569 - regression_loss: 0.5746 - classification_loss: 0.0822
131/500 [======>.......................] - ETA: 2:01 - loss: 0.6551 - regression_loss: 0.5731 - classification_loss: 0.0820
132/500 [======>.......................] - ETA: 2:00 - loss: 0.6566 - regression_loss: 0.5749 - classification_loss: 0.0817
133/500 [======>.......................] - ETA: 2:00 - loss: 0.6537 - regression_loss: 0.5724 - classification_loss: 0.0812
134/500 [=======>......................] - ETA: 2:00 - loss: 0.6535 - regression_loss: 0.5723 - classification_loss: 0.0812
135/500 [=======>......................] - ETA: 1:59 - loss: 0.6530 - regression_loss: 0.5719 - classification_loss: 0.0812
136/500 [=======>......................] - ETA: 1:59 - loss: 0.6558 - regression_loss: 0.5741 - classification_loss: 0.0818
137/500 [=======>......................] - ETA: 1:59 - loss: 0.6593 - regression_loss: 0.5768 - classification_loss: 0.0825
138/500 [=======>......................] - ETA: 1:58 - loss: 0.6589 - regression_loss: 0.5763 - classification_loss: 0.0826
139/500 [=======>......................] - ETA: 1:58 - loss: 0.6565 - regression_loss: 0.5742 - classification_loss: 0.0823
140/500 [=======>......................] - ETA: 1:58 - loss: 0.6572 - regression_loss: 0.5749 - classification_loss: 0.0823
141/500 [=======>......................] - ETA: 1:57 - loss: 0.6574 - regression_loss: 0.5751 - classification_loss: 0.0823
142/500 [=======>......................] - ETA: 1:57 - loss: 0.6541 - regression_loss: 0.5722 - classification_loss: 0.0819
143/500 [=======>......................] - ETA: 1:57 - loss: 0.6521 - regression_loss: 0.5706 - classification_loss: 0.0815
144/500 [=======>......................] - ETA: 1:56 - loss: 0.6531 - regression_loss: 0.5716 - classification_loss: 0.0815
145/500 [=======>......................] - ETA: 1:56 - loss: 0.6553 - regression_loss: 0.5734 - classification_loss: 0.0819
146/500 [=======>......................] - ETA: 1:56 - loss: 0.6541 - regression_loss: 0.5724 - classification_loss: 0.0817
147/500 [=======>......................] - ETA: 1:55 - loss: 0.6539 - regression_loss: 0.5719 - classification_loss: 0.0820
148/500 [=======>......................] - ETA: 1:55 - loss: 0.6507 - regression_loss: 0.5691 - classification_loss: 0.0816
149/500 [=======>......................] - ETA: 1:55 - loss: 0.6516 - regression_loss: 0.5698 - classification_loss: 0.0818
150/500 [========>.....................] - ETA: 1:54 - loss: 0.6501 - regression_loss: 0.5687 - classification_loss: 0.0814
151/500 [========>.....................] - ETA: 1:54 - loss: 0.6499 - regression_loss: 0.5686 - classification_loss: 0.0813
152/500 [========>.....................] - ETA: 1:54 - loss: 0.6479 - regression_loss: 0.5669 - classification_loss: 0.0810
153/500 [========>.....................] - ETA: 1:53 - loss: 0.6455 - regression_loss: 0.5648 - classification_loss: 0.0807
154/500 [========>.....................] - ETA: 1:53 - loss: 0.6438 - regression_loss: 0.5633 - classification_loss: 0.0805
155/500 [========>.....................] - ETA: 1:53 - loss: 0.6441 - regression_loss: 0.5636 - classification_loss: 0.0805
156/500 [========>.....................] - ETA: 1:52 - loss: 0.6455 - regression_loss: 0.5645 - classification_loss: 0.0810
157/500 [========>.....................] - ETA: 1:52 - loss: 0.6473 - regression_loss: 0.5659 - classification_loss: 0.0815
158/500 [========>.....................] - ETA: 1:52 - loss: 0.6490 - regression_loss: 0.5673 - classification_loss: 0.0818
159/500 [========>.....................] - ETA: 1:51 - loss: 0.6496 - regression_loss: 0.5678 - classification_loss: 0.0819
160/500 [========>.....................] - ETA: 1:51 - loss: 0.6481 - regression_loss: 0.5664 - classification_loss: 0.0817
161/500 [========>.....................] - ETA: 1:51 - loss: 0.6469 - regression_loss: 0.5654 - classification_loss: 0.0815
162/500 [========>.....................] - ETA: 1:50 - loss: 0.6471 - regression_loss: 0.5657 - classification_loss: 0.0814
163/500 [========>.....................] - ETA: 1:50 - loss: 0.6453 - regression_loss: 0.5642 - classification_loss: 0.0811
164/500 [========>.....................] - ETA: 1:50 - loss: 0.6440 - regression_loss: 0.5630 - classification_loss: 0.0810
165/500 [========>.....................] - ETA: 1:49 - loss: 0.6435 - regression_loss: 0.5627 - classification_loss: 0.0808
166/500 [========>.....................] - ETA: 1:49 - loss: 0.6427 - regression_loss: 0.5621 - classification_loss: 0.0806
167/500 [=========>....................] - ETA: 1:49 - loss: 0.6410 - regression_loss: 0.5606 - classification_loss: 0.0804
168/500 [=========>....................] - ETA: 1:49 - loss: 0.6418 - regression_loss: 0.5614 - classification_loss: 0.0804
169/500 [=========>....................] - ETA: 1:48 - loss: 0.6392 - regression_loss: 0.5592 - classification_loss: 0.0800
170/500 [=========>....................] - ETA: 1:48 - loss: 0.6391 - regression_loss: 0.5592 - classification_loss: 0.0799
171/500 [=========>....................] - ETA: 1:47 - loss: 0.6381 - regression_loss: 0.5583 - classification_loss: 0.0798
172/500 [=========>....................] - ETA: 1:47 - loss: 0.6367 - regression_loss: 0.5571 - classification_loss: 0.0795
173/500 [=========>....................] - ETA: 1:47 - loss: 0.6371 - regression_loss: 0.5574 - classification_loss: 0.0797
174/500 [=========>....................] - ETA: 1:47 - loss: 0.6369 - regression_loss: 0.5572 - classification_loss: 0.0797
175/500 [=========>....................] - ETA: 1:46 - loss: 0.6374 - regression_loss: 0.5575 - classification_loss: 0.0799
176/500 [=========>....................] - ETA: 1:46 - loss: 0.6375 - regression_loss: 0.5576 - classification_loss: 0.0798
177/500 [=========>....................] - ETA: 1:46 - loss: 0.6375 - regression_loss: 0.5577 - classification_loss: 0.0797
178/500 [=========>....................] - ETA: 1:45 - loss: 0.6389 - regression_loss: 0.5586 - classification_loss: 0.0803
179/500 [=========>....................] - ETA: 1:45 - loss: 0.6384 - regression_loss: 0.5582 - classification_loss: 0.0802
180/500 [=========>....................] - ETA: 1:45 - loss: 0.6396 - regression_loss: 0.5591 - classification_loss: 0.0805
181/500 [=========>....................] - ETA: 1:44 - loss: 0.6396 - regression_loss: 0.5591 - classification_loss: 0.0805
182/500 [=========>....................] - ETA: 1:44 - loss: 0.6393 - regression_loss: 0.5590 - classification_loss: 0.0803
183/500 [=========>....................] - ETA: 1:44 - loss: 0.6398 - regression_loss: 0.5593 - classification_loss: 0.0805
184/500 [==========>...................] - ETA: 1:43 - loss: 0.6395 - regression_loss: 0.5590 - classification_loss: 0.0805
185/500 [==========>...................] - ETA: 1:43 - loss: 0.6384 - regression_loss: 0.5582 - classification_loss: 0.0802
186/500 [==========>...................] - ETA: 1:43 - loss: 0.6380 - regression_loss: 0.5579 - classification_loss: 0.0801
187/500 [==========>...................] - ETA: 1:42 - loss: 0.6371 - regression_loss: 0.5570 - classification_loss: 0.0800
188/500 [==========>...................] - ETA: 1:42 - loss: 0.6355 - regression_loss: 0.5557 - classification_loss: 0.0798
189/500 [==========>...................] - ETA: 1:42 - loss: 0.6336 - regression_loss: 0.5540 - classification_loss: 0.0796
190/500 [==========>...................] - ETA: 1:41 - loss: 0.6351 - regression_loss: 0.5552 - classification_loss: 0.0798
191/500 [==========>...................] - ETA: 1:41 - loss: 0.6335 - regression_loss: 0.5539 - classification_loss: 0.0797
192/500 [==========>...................] - ETA: 1:41 - loss: 0.6339 - regression_loss: 0.5543 - classification_loss: 0.0797
193/500 [==========>...................] - ETA: 1:40 - loss: 0.6350 - regression_loss: 0.5553 - classification_loss: 0.0797
194/500 [==========>...................] - ETA: 1:40 - loss: 0.6343 - regression_loss: 0.5546 - classification_loss: 0.0797
195/500 [==========>...................] - ETA: 1:40 - loss: 0.6347 - regression_loss: 0.5549 - classification_loss: 0.0798
196/500 [==========>...................] - ETA: 1:39 - loss: 0.6344 - regression_loss: 0.5545 - classification_loss: 0.0798
197/500 [==========>...................] - ETA: 1:39 - loss: 0.6359 - regression_loss: 0.5558 - classification_loss: 0.0801
198/500 [==========>...................] - ETA: 1:39 - loss: 0.6365 - regression_loss: 0.5564 - classification_loss: 0.0801
199/500 [==========>...................] - ETA: 1:38 - loss: 0.6380 - regression_loss: 0.5574 - classification_loss: 0.0806
200/500 [===========>..................] - ETA: 1:38 - loss: 0.6366 - regression_loss: 0.5563 - classification_loss: 0.0803
201/500 [===========>..................] - ETA: 1:38 - loss: 0.6356 - regression_loss: 0.5555 - classification_loss: 0.0801
202/500 [===========>..................] - ETA: 1:37 - loss: 0.6364 - regression_loss: 0.5563 - classification_loss: 0.0801
203/500 [===========>..................] - ETA: 1:37 - loss: 0.6342 - regression_loss: 0.5543 - classification_loss: 0.0799
204/500 [===========>..................] - ETA: 1:37 - loss: 0.6330 - regression_loss: 0.5533 - classification_loss: 0.0798
205/500 [===========>..................] - ETA: 1:36 - loss: 0.6309 - regression_loss: 0.5514 - classification_loss: 0.0795
206/500 [===========>..................] - ETA: 1:36 - loss: 0.6312 - regression_loss: 0.5515 - classification_loss: 0.0797
207/500 [===========>..................] - ETA: 1:36 - loss: 0.6289 - regression_loss: 0.5496 - classification_loss: 0.0794
208/500 [===========>..................] - ETA: 1:35 - loss: 0.6270 - regression_loss: 0.5479 - classification_loss: 0.0791
209/500 [===========>..................] - ETA: 1:35 - loss: 0.6254 - regression_loss: 0.5464 - classification_loss: 0.0789
210/500 [===========>..................] - ETA: 1:35 - loss: 0.6267 - regression_loss: 0.5477 - classification_loss: 0.0790
211/500 [===========>..................] - ETA: 1:34 - loss: 0.6248 - regression_loss: 0.5460 - classification_loss: 0.0788
212/500 [===========>..................] - ETA: 1:34 - loss: 0.6249 - regression_loss: 0.5461 - classification_loss: 0.0788
213/500 [===========>..................] - ETA: 1:34 - loss: 0.6234 - regression_loss: 0.5449 - classification_loss: 0.0785
214/500 [===========>..................] - ETA: 1:33 - loss: 0.6238 - regression_loss: 0.5453 - classification_loss: 0.0784
215/500 [===========>..................] - ETA: 1:33 - loss: 0.6234 - regression_loss: 0.5452 - classification_loss: 0.0782
216/500 [===========>..................] - ETA: 1:33 - loss: 0.6216 - regression_loss: 0.5436 - classification_loss: 0.0779
217/500 [============>.................] - ETA: 1:32 - loss: 0.6204 - regression_loss: 0.5426 - classification_loss: 0.0778
218/500 [============>.................] - ETA: 1:32 - loss: 0.6203 - regression_loss: 0.5426 - classification_loss: 0.0776
219/500 [============>.................] - ETA: 1:32 - loss: 0.6217 - regression_loss: 0.5438 - classification_loss: 0.0779
220/500 [============>.................] - ETA: 1:31 - loss: 0.6231 - regression_loss: 0.5451 - classification_loss: 0.0780
221/500 [============>.................] - ETA: 1:31 - loss: 0.6231 - regression_loss: 0.5452 - classification_loss: 0.0780
222/500 [============>.................] - ETA: 1:31 - loss: 0.6232 - regression_loss: 0.5453 - classification_loss: 0.0779
223/500 [============>.................] - ETA: 1:30 - loss: 0.6222 - regression_loss: 0.5445 - classification_loss: 0.0777
224/500 [============>.................] - ETA: 1:30 - loss: 0.6215 - regression_loss: 0.5439 - classification_loss: 0.0776
225/500 [============>.................] - ETA: 1:30 - loss: 0.6211 - regression_loss: 0.5437 - classification_loss: 0.0774
226/500 [============>.................] - ETA: 1:29 - loss: 0.6213 - regression_loss: 0.5439 - classification_loss: 0.0774
227/500 [============>.................] - ETA: 1:29 - loss: 0.6225 - regression_loss: 0.5451 - classification_loss: 0.0774
228/500 [============>.................] - ETA: 1:29 - loss: 0.6209 - regression_loss: 0.5436 - classification_loss: 0.0772
229/500 [============>.................] - ETA: 1:28 - loss: 0.6203 - regression_loss: 0.5431 - classification_loss: 0.0772
230/500 [============>.................] - ETA: 1:28 - loss: 0.6189 - regression_loss: 0.5419 - classification_loss: 0.0770
231/500 [============>.................] - ETA: 1:28 - loss: 0.6193 - regression_loss: 0.5423 - classification_loss: 0.0770
232/500 [============>.................] - ETA: 1:27 - loss: 0.6176 - regression_loss: 0.5408 - classification_loss: 0.0768
233/500 [============>.................] - ETA: 1:27 - loss: 0.6190 - regression_loss: 0.5418 - classification_loss: 0.0771
234/500 [=============>................] - ETA: 1:27 - loss: 0.6182 - regression_loss: 0.5412 - classification_loss: 0.0770
235/500 [=============>................] - ETA: 1:27 - loss: 0.6167 - regression_loss: 0.5400 - classification_loss: 0.0768
236/500 [=============>................] - ETA: 1:26 - loss: 0.6169 - regression_loss: 0.5400 - classification_loss: 0.0769
237/500 [=============>................] - ETA: 1:26 - loss: 0.6162 - regression_loss: 0.5395 - classification_loss: 0.0767
238/500 [=============>................] - ETA: 1:26 - loss: 0.6145 - regression_loss: 0.5380 - classification_loss: 0.0765
239/500 [=============>................] - ETA: 1:25 - loss: 0.6138 - regression_loss: 0.5374 - classification_loss: 0.0764
240/500 [=============>................] - ETA: 1:25 - loss: 0.6127 - regression_loss: 0.5364 - classification_loss: 0.0762
241/500 [=============>................] - ETA: 1:25 - loss: 0.6135 - regression_loss: 0.5371 - classification_loss: 0.0763
242/500 [=============>................] - ETA: 1:24 - loss: 0.6128 - regression_loss: 0.5366 - classification_loss: 0.0761
243/500 [=============>................] - ETA: 1:24 - loss: 0.6139 - regression_loss: 0.5375 - classification_loss: 0.0764
244/500 [=============>................] - ETA: 1:24 - loss: 0.6161 - regression_loss: 0.5393 - classification_loss: 0.0767
245/500 [=============>................] - ETA: 1:23 - loss: 0.6161 - regression_loss: 0.5393 - classification_loss: 0.0767
246/500 [=============>................] - ETA: 1:23 - loss: 0.6161 - regression_loss: 0.5394 - classification_loss: 0.0768
247/500 [=============>................] - ETA: 1:23 - loss: 0.6153 - regression_loss: 0.5387 - classification_loss: 0.0766
248/500 [=============>................] - ETA: 1:22 - loss: 0.6167 - regression_loss: 0.5401 - classification_loss: 0.0766
249/500 [=============>................] - ETA: 1:22 - loss: 0.6152 - regression_loss: 0.5388 - classification_loss: 0.0764
250/500 [==============>...............] - ETA: 1:22 - loss: 0.6147 - regression_loss: 0.5384 - classification_loss: 0.0763
251/500 [==============>...............] - ETA: 1:21 - loss: 0.6158 - regression_loss: 0.5395 - classification_loss: 0.0763
252/500 [==============>...............] - ETA: 1:21 - loss: 0.6149 - regression_loss: 0.5387 - classification_loss: 0.0762
253/500 [==============>...............] - ETA: 1:21 - loss: 0.6155 - regression_loss: 0.5395 - classification_loss: 0.0760
254/500 [==============>...............] - ETA: 1:20 - loss: 0.6164 - regression_loss: 0.5403 - classification_loss: 0.0761
255/500 [==============>...............] - ETA: 1:20 - loss: 0.6170 - regression_loss: 0.5409 - classification_loss: 0.0761
256/500 [==============>...............] - ETA: 1:20 - loss: 0.6163 - regression_loss: 0.5402 - classification_loss: 0.0761
257/500 [==============>...............] - ETA: 1:19 - loss: 0.6162 - regression_loss: 0.5402 - classification_loss: 0.0760
258/500 [==============>...............] - ETA: 1:19 - loss: 0.6155 - regression_loss: 0.5396 - classification_loss: 0.0758
259/500 [==============>...............] - ETA: 1:19 - loss: 0.6154 - regression_loss: 0.5395 - classification_loss: 0.0759
260/500 [==============>...............] - ETA: 1:18 - loss: 0.6160 - regression_loss: 0.5401 - classification_loss: 0.0759
261/500 [==============>...............] - ETA: 1:18 - loss: 0.6165 - regression_loss: 0.5403 - classification_loss: 0.0761
262/500 [==============>...............] - ETA: 1:18 - loss: 0.6168 - regression_loss: 0.5405 - classification_loss: 0.0763
263/500 [==============>...............] - ETA: 1:17 - loss: 0.6163 - regression_loss: 0.5401 - classification_loss: 0.0762
264/500 [==============>...............] - ETA: 1:17 - loss: 0.6173 - regression_loss: 0.5410 - classification_loss: 0.0764
265/500 [==============>...............] - ETA: 1:17 - loss: 0.6165 - regression_loss: 0.5403 - classification_loss: 0.0762
266/500 [==============>...............] - ETA: 1:16 - loss: 0.6177 - regression_loss: 0.5413 - classification_loss: 0.0764
267/500 [===============>..............] - ETA: 1:16 - loss: 0.6163 - regression_loss: 0.5400 - classification_loss: 0.0763
268/500 [===============>..............] - ETA: 1:16 - loss: 0.6181 - regression_loss: 0.5415 - classification_loss: 0.0766
269/500 [===============>..............] - ETA: 1:15 - loss: 0.6174 - regression_loss: 0.5409 - classification_loss: 0.0765
270/500 [===============>..............] - ETA: 1:15 - loss: 0.6181 - regression_loss: 0.5415 - classification_loss: 0.0766
271/500 [===============>..............] - ETA: 1:15 - loss: 0.6196 - regression_loss: 0.5428 - classification_loss: 0.0769
272/500 [===============>..............] - ETA: 1:14 - loss: 0.6200 - regression_loss: 0.5431 - classification_loss: 0.0769
273/500 [===============>..............] - ETA: 1:14 - loss: 0.6200 - regression_loss: 0.5430 - classification_loss: 0.0771
274/500 [===============>..............] - ETA: 1:14 - loss: 0.6193 - regression_loss: 0.5424 - classification_loss: 0.0769
275/500 [===============>..............] - ETA: 1:13 - loss: 0.6197 - regression_loss: 0.5427 - classification_loss: 0.0770
276/500 [===============>..............] - ETA: 1:13 - loss: 0.6208 - regression_loss: 0.5435 - classification_loss: 0.0773
277/500 [===============>..............] - ETA: 1:13 - loss: 0.6214 - regression_loss: 0.5441 - classification_loss: 0.0774
278/500 [===============>..............] - ETA: 1:12 - loss: 0.6221 - regression_loss: 0.5446 - classification_loss: 0.0775
279/500 [===============>..............] - ETA: 1:12 - loss: 0.6211 - regression_loss: 0.5439 - classification_loss: 0.0772
280/500 [===============>..............] - ETA: 1:12 - loss: 0.6205 - regression_loss: 0.5433 - classification_loss: 0.0772
281/500 [===============>..............] - ETA: 1:11 - loss: 0.6201 - regression_loss: 0.5430 - classification_loss: 0.0771
282/500 [===============>..............] - ETA: 1:11 - loss: 0.6193 - regression_loss: 0.5423 - classification_loss: 0.0769
283/500 [===============>..............] - ETA: 1:11 - loss: 0.6191 - regression_loss: 0.5422 - classification_loss: 0.0769
284/500 [================>.............] - ETA: 1:10 - loss: 0.6197 - regression_loss: 0.5426 - classification_loss: 0.0771
285/500 [================>.............] - ETA: 1:10 - loss: 0.6196 - regression_loss: 0.5426 - classification_loss: 0.0770
286/500 [================>.............] - ETA: 1:10 - loss: 0.6183 - regression_loss: 0.5415 - classification_loss: 0.0768
287/500 [================>.............] - ETA: 1:09 - loss: 0.6187 - regression_loss: 0.5416 - classification_loss: 0.0770
288/500 [================>.............] - ETA: 1:09 - loss: 0.6183 - regression_loss: 0.5413 - classification_loss: 0.0770
289/500 [================>.............] - ETA: 1:09 - loss: 0.6191 - regression_loss: 0.5418 - classification_loss: 0.0772
290/500 [================>.............] - ETA: 1:08 - loss: 0.6186 - regression_loss: 0.5413 - classification_loss: 0.0773
291/500 [================>.............] - ETA: 1:08 - loss: 0.6209 - regression_loss: 0.5432 - classification_loss: 0.0777
292/500 [================>.............] - ETA: 1:08 - loss: 0.6206 - regression_loss: 0.5429 - classification_loss: 0.0777
293/500 [================>.............] - ETA: 1:07 - loss: 0.6203 - regression_loss: 0.5426 - classification_loss: 0.0777
294/500 [================>.............] - ETA: 1:07 - loss: 0.6192 - regression_loss: 0.5416 - classification_loss: 0.0775
295/500 [================>.............] - ETA: 1:07 - loss: 0.6174 - regression_loss: 0.5400 - classification_loss: 0.0774
296/500 [================>.............] - ETA: 1:06 - loss: 0.6166 - regression_loss: 0.5394 - classification_loss: 0.0772
297/500 [================>.............] - ETA: 1:06 - loss: 0.6152 - regression_loss: 0.5382 - classification_loss: 0.0771
298/500 [================>.............] - ETA: 1:06 - loss: 0.6154 - regression_loss: 0.5385 - classification_loss: 0.0770
299/500 [================>.............] - ETA: 1:06 - loss: 0.6141 - regression_loss: 0.5373 - classification_loss: 0.0768
300/500 [=================>............] - ETA: 1:05 - loss: 0.6131 - regression_loss: 0.5365 - classification_loss: 0.0767
301/500 [=================>............] - ETA: 1:05 - loss: 0.6122 - regression_loss: 0.5357 - classification_loss: 0.0765
302/500 [=================>............] - ETA: 1:05 - loss: 0.6128 - regression_loss: 0.5361 - classification_loss: 0.0766
303/500 [=================>............] - ETA: 1:04 - loss: 0.6131 - regression_loss: 0.5365 - classification_loss: 0.0766
304/500 [=================>............] - ETA: 1:04 - loss: 0.6142 - regression_loss: 0.5375 - classification_loss: 0.0767
305/500 [=================>............] - ETA: 1:04 - loss: 0.6135 - regression_loss: 0.5370 - classification_loss: 0.0765
306/500 [=================>............] - ETA: 1:03 - loss: 0.6131 - regression_loss: 0.5367 - classification_loss: 0.0765
307/500 [=================>............] - ETA: 1:03 - loss: 0.6129 - regression_loss: 0.5366 - classification_loss: 0.0764
308/500 [=================>............] - ETA: 1:03 - loss: 0.6123 - regression_loss: 0.5360 - classification_loss: 0.0763
309/500 [=================>............] - ETA: 1:02 - loss: 0.6122 - regression_loss: 0.5359 - classification_loss: 0.0763
310/500 [=================>............] - ETA: 1:02 - loss: 0.6118 - regression_loss: 0.5355 - classification_loss: 0.0762
311/500 [=================>............] - ETA: 1:02 - loss: 0.6125 - regression_loss: 0.5361 - classification_loss: 0.0764
312/500 [=================>............] - ETA: 1:01 - loss: 0.6131 - regression_loss: 0.5366 - classification_loss: 0.0765
313/500 [=================>............] - ETA: 1:01 - loss: 0.6125 - regression_loss: 0.5361 - classification_loss: 0.0764
314/500 [=================>............] - ETA: 1:01 - loss: 0.6123 - regression_loss: 0.5359 - classification_loss: 0.0763
315/500 [=================>............] - ETA: 1:00 - loss: 0.6112 - regression_loss: 0.5351 - classification_loss: 0.0761
316/500 [=================>............] - ETA: 1:00 - loss: 0.6111 - regression_loss: 0.5350 - classification_loss: 0.0761
317/500 [==================>...........] - ETA: 1:00 - loss: 0.6108 - regression_loss: 0.5348 - classification_loss: 0.0760
318/500 [==================>...........] - ETA: 59s - loss: 0.6106 - regression_loss: 0.5346 - classification_loss: 0.0760
319/500 [==================>...........] - ETA: 59s - loss: 0.6100 - regression_loss: 0.5342 - classification_loss: 0.0758
320/500 [==================>...........] - ETA: 59s - loss: 0.6100 - regression_loss: 0.5342 - classification_loss: 0.0758
321/500 [==================>...........] - ETA: 58s - loss: 0.6092 - regression_loss: 0.5334 - classification_loss: 0.0758
322/500 [==================>...........] - ETA: 58s - loss: 0.6096 - regression_loss: 0.5338 - classification_loss: 0.0758
323/500 [==================>...........] - ETA: 58s - loss: 0.6088 - regression_loss: 0.5331 - classification_loss: 0.0756
324/500 [==================>...........] - ETA: 57s - loss: 0.6084 - regression_loss: 0.5329 - classification_loss: 0.0755
325/500 [==================>...........] - ETA: 57s - loss: 0.6094 - regression_loss: 0.5337 - classification_loss: 0.0758
326/500 [==================>...........] - ETA: 57s - loss: 0.6088 - regression_loss: 0.5332 - classification_loss: 0.0756
327/500 [==================>...........] - ETA: 56s - loss: 0.6100 - regression_loss: 0.5344 - classification_loss: 0.0756
328/500 [==================>...........] - ETA: 56s - loss: 0.6104 - regression_loss: 0.5347 - classification_loss: 0.0757
329/500 [==================>...........] - ETA: 56s - loss: 0.6100 - regression_loss: 0.5344 - classification_loss: 0.0755
330/500 [==================>...........] - ETA: 55s - loss: 0.6100 - regression_loss: 0.5344 - classification_loss: 0.0756
331/500 [==================>...........] - ETA: 55s - loss: 0.6095 - regression_loss: 0.5340 - classification_loss: 0.0754
332/500 [==================>...........] - ETA: 55s - loss: 0.6089 - regression_loss: 0.5336 - classification_loss: 0.0753
333/500 [==================>...........] - ETA: 54s - loss: 0.6091 - regression_loss: 0.5338 - classification_loss: 0.0753
334/500 [===================>..........] - ETA: 54s - loss: 0.6096 - regression_loss: 0.5342 - classification_loss: 0.0754
335/500 [===================>..........] - ETA: 54s - loss: 0.6086 - regression_loss: 0.5333 - classification_loss: 0.0753
336/500 [===================>..........] - ETA: 53s - loss: 0.6081 - regression_loss: 0.5329 - classification_loss: 0.0752
337/500 [===================>..........] - ETA: 53s - loss: 0.6087 - regression_loss: 0.5334 - classification_loss: 0.0753
338/500 [===================>..........] - ETA: 53s - loss: 0.6089 - regression_loss: 0.5336 - classification_loss: 0.0753
339/500 [===================>..........] - ETA: 52s - loss: 0.6091 - regression_loss: 0.5338 - classification_loss: 0.0753
340/500 [===================>..........] - ETA: 52s - loss: 0.6099 - regression_loss: 0.5344 - classification_loss: 0.0755
341/500 [===================>..........] - ETA: 52s - loss: 0.6086 - regression_loss: 0.5332 - classification_loss: 0.0753
342/500 [===================>..........] - ETA: 51s - loss: 0.6089 - regression_loss: 0.5335 - classification_loss: 0.0754
343/500 [===================>..........] - ETA: 51s - loss: 0.6095 - regression_loss: 0.5340 - classification_loss: 0.0755
344/500 [===================>..........] - ETA: 51s - loss: 0.6092 - regression_loss: 0.5337 - classification_loss: 0.0755
345/500 [===================>..........] - ETA: 51s - loss: 0.6080 - regression_loss: 0.5327 - classification_loss: 0.0753
346/500 [===================>..........] - ETA: 50s - loss: 0.6083 - regression_loss: 0.5330 - classification_loss: 0.0753
347/500 [===================>..........] - ETA: 50s - loss: 0.6075 - regression_loss: 0.5323 - classification_loss: 0.0752
348/500 [===================>..........] - ETA: 50s - loss: 0.6082 - regression_loss: 0.5328 - classification_loss: 0.0754
349/500 [===================>..........] - ETA: 49s - loss: 0.6075 - regression_loss: 0.5322 - classification_loss: 0.0752
350/500 [====================>.........] - ETA: 49s - loss: 0.6063 - regression_loss: 0.5312 - classification_loss: 0.0750
351/500 [====================>.........] - ETA: 49s - loss: 0.6054 - regression_loss: 0.5305 - classification_loss: 0.0749
352/500 [====================>.........] - ETA: 48s - loss: 0.6053 - regression_loss: 0.5305 - classification_loss: 0.0748
353/500 [====================>.........] - ETA: 48s - loss: 0.6045 - regression_loss: 0.5298 - classification_loss: 0.0747
354/500 [====================>.........] - ETA: 48s - loss: 0.6047 - regression_loss: 0.5299 - classification_loss: 0.0747
355/500 [====================>.........] - ETA: 47s - loss: 0.6042 - regression_loss: 0.5296 - classification_loss: 0.0746
356/500 [====================>.........]
- ETA: 47s - loss: 0.6044 - regression_loss: 0.5297 - classification_loss: 0.0747 357/500 [====================>.........] - ETA: 47s - loss: 0.6040 - regression_loss: 0.5293 - classification_loss: 0.0746 358/500 [====================>.........] - ETA: 46s - loss: 0.6049 - regression_loss: 0.5300 - classification_loss: 0.0749 359/500 [====================>.........] - ETA: 46s - loss: 0.6046 - regression_loss: 0.5298 - classification_loss: 0.0748 360/500 [====================>.........] - ETA: 46s - loss: 0.6041 - regression_loss: 0.5294 - classification_loss: 0.0748 361/500 [====================>.........] - ETA: 45s - loss: 0.6056 - regression_loss: 0.5306 - classification_loss: 0.0750 362/500 [====================>.........] - ETA: 45s - loss: 0.6054 - regression_loss: 0.5304 - classification_loss: 0.0749 363/500 [====================>.........] - ETA: 45s - loss: 0.6056 - regression_loss: 0.5307 - classification_loss: 0.0749 364/500 [====================>.........] - ETA: 44s - loss: 0.6052 - regression_loss: 0.5304 - classification_loss: 0.0748 365/500 [====================>.........] - ETA: 44s - loss: 0.6048 - regression_loss: 0.5301 - classification_loss: 0.0747 366/500 [====================>.........] - ETA: 44s - loss: 0.6053 - regression_loss: 0.5306 - classification_loss: 0.0747 367/500 [=====================>........] - ETA: 43s - loss: 0.6057 - regression_loss: 0.5310 - classification_loss: 0.0748 368/500 [=====================>........] - ETA: 43s - loss: 0.6059 - regression_loss: 0.5312 - classification_loss: 0.0747 369/500 [=====================>........] - ETA: 43s - loss: 0.6050 - regression_loss: 0.5304 - classification_loss: 0.0746 370/500 [=====================>........] - ETA: 42s - loss: 0.6050 - regression_loss: 0.5304 - classification_loss: 0.0746 371/500 [=====================>........] - ETA: 42s - loss: 0.6049 - regression_loss: 0.5304 - classification_loss: 0.0744 372/500 [=====================>........] 
- ETA: 42s - loss: 0.6048 - regression_loss: 0.5304 - classification_loss: 0.0744 373/500 [=====================>........] - ETA: 41s - loss: 0.6060 - regression_loss: 0.5314 - classification_loss: 0.0746 374/500 [=====================>........] - ETA: 41s - loss: 0.6062 - regression_loss: 0.5316 - classification_loss: 0.0746 375/500 [=====================>........] - ETA: 41s - loss: 0.6069 - regression_loss: 0.5321 - classification_loss: 0.0748 376/500 [=====================>........] - ETA: 40s - loss: 0.6062 - regression_loss: 0.5315 - classification_loss: 0.0747 377/500 [=====================>........] - ETA: 40s - loss: 0.6058 - regression_loss: 0.5312 - classification_loss: 0.0745 378/500 [=====================>........] - ETA: 40s - loss: 0.6063 - regression_loss: 0.5317 - classification_loss: 0.0746 379/500 [=====================>........] - ETA: 39s - loss: 0.6058 - regression_loss: 0.5313 - classification_loss: 0.0745 380/500 [=====================>........] - ETA: 39s - loss: 0.6055 - regression_loss: 0.5310 - classification_loss: 0.0744 381/500 [=====================>........] - ETA: 39s - loss: 0.6054 - regression_loss: 0.5311 - classification_loss: 0.0744 382/500 [=====================>........] - ETA: 38s - loss: 0.6045 - regression_loss: 0.5303 - classification_loss: 0.0742 383/500 [=====================>........] - ETA: 38s - loss: 0.6040 - regression_loss: 0.5298 - classification_loss: 0.0742 384/500 [======================>.......] - ETA: 38s - loss: 0.6035 - regression_loss: 0.5295 - classification_loss: 0.0741 385/500 [======================>.......] - ETA: 37s - loss: 0.6037 - regression_loss: 0.5295 - classification_loss: 0.0742 386/500 [======================>.......] - ETA: 37s - loss: 0.6032 - regression_loss: 0.5291 - classification_loss: 0.0741 387/500 [======================>.......] - ETA: 37s - loss: 0.6034 - regression_loss: 0.5293 - classification_loss: 0.0741 388/500 [======================>.......] 
- ETA: 36s - loss: 0.6025 - regression_loss: 0.5285 - classification_loss: 0.0740 389/500 [======================>.......] - ETA: 36s - loss: 0.6027 - regression_loss: 0.5287 - classification_loss: 0.0741 390/500 [======================>.......] - ETA: 36s - loss: 0.6030 - regression_loss: 0.5290 - classification_loss: 0.0741 391/500 [======================>.......] - ETA: 35s - loss: 0.6030 - regression_loss: 0.5289 - classification_loss: 0.0741 392/500 [======================>.......] - ETA: 35s - loss: 0.6032 - regression_loss: 0.5292 - classification_loss: 0.0741 393/500 [======================>.......] - ETA: 35s - loss: 0.6026 - regression_loss: 0.5286 - classification_loss: 0.0740 394/500 [======================>.......] - ETA: 34s - loss: 0.6032 - regression_loss: 0.5291 - classification_loss: 0.0741 395/500 [======================>.......] - ETA: 34s - loss: 0.6028 - regression_loss: 0.5288 - classification_loss: 0.0740 396/500 [======================>.......] - ETA: 34s - loss: 0.6017 - regression_loss: 0.5278 - classification_loss: 0.0739 397/500 [======================>.......] - ETA: 33s - loss: 0.6019 - regression_loss: 0.5280 - classification_loss: 0.0739 398/500 [======================>.......] - ETA: 33s - loss: 0.6014 - regression_loss: 0.5276 - classification_loss: 0.0738 399/500 [======================>.......] - ETA: 33s - loss: 0.6016 - regression_loss: 0.5278 - classification_loss: 0.0738 400/500 [=======================>......] - ETA: 32s - loss: 0.6020 - regression_loss: 0.5282 - classification_loss: 0.0738 401/500 [=======================>......] - ETA: 32s - loss: 0.6016 - regression_loss: 0.5279 - classification_loss: 0.0737 402/500 [=======================>......] - ETA: 32s - loss: 0.6028 - regression_loss: 0.5289 - classification_loss: 0.0739 403/500 [=======================>......] - ETA: 31s - loss: 0.6033 - regression_loss: 0.5294 - classification_loss: 0.0739 404/500 [=======================>......] 
- ETA: 31s - loss: 0.6031 - regression_loss: 0.5292 - classification_loss: 0.0739 405/500 [=======================>......] - ETA: 31s - loss: 0.6022 - regression_loss: 0.5284 - classification_loss: 0.0738 406/500 [=======================>......] - ETA: 30s - loss: 0.6014 - regression_loss: 0.5277 - classification_loss: 0.0737 407/500 [=======================>......] - ETA: 30s - loss: 0.6005 - regression_loss: 0.5270 - classification_loss: 0.0736 408/500 [=======================>......] - ETA: 30s - loss: 0.5999 - regression_loss: 0.5264 - classification_loss: 0.0735 409/500 [=======================>......] - ETA: 29s - loss: 0.6007 - regression_loss: 0.5271 - classification_loss: 0.0736 410/500 [=======================>......] - ETA: 29s - loss: 0.5999 - regression_loss: 0.5264 - classification_loss: 0.0735 411/500 [=======================>......] - ETA: 29s - loss: 0.5996 - regression_loss: 0.5262 - classification_loss: 0.0734 412/500 [=======================>......] - ETA: 28s - loss: 0.5991 - regression_loss: 0.5258 - classification_loss: 0.0733 413/500 [=======================>......] - ETA: 28s - loss: 0.5992 - regression_loss: 0.5258 - classification_loss: 0.0734 414/500 [=======================>......] - ETA: 28s - loss: 0.5984 - regression_loss: 0.5252 - classification_loss: 0.0733 415/500 [=======================>......] - ETA: 28s - loss: 0.5992 - regression_loss: 0.5258 - classification_loss: 0.0734 416/500 [=======================>......] - ETA: 27s - loss: 0.5994 - regression_loss: 0.5259 - classification_loss: 0.0734 417/500 [========================>.....] - ETA: 27s - loss: 0.5999 - regression_loss: 0.5266 - classification_loss: 0.0733 418/500 [========================>.....] - ETA: 27s - loss: 0.5998 - regression_loss: 0.5265 - classification_loss: 0.0733 419/500 [========================>.....] - ETA: 26s - loss: 0.6002 - regression_loss: 0.5269 - classification_loss: 0.0733 420/500 [========================>.....] 
- ETA: 26s - loss: 0.5996 - regression_loss: 0.5264 - classification_loss: 0.0732 421/500 [========================>.....] - ETA: 26s - loss: 0.5987 - regression_loss: 0.5257 - classification_loss: 0.0731 422/500 [========================>.....] - ETA: 25s - loss: 0.5982 - regression_loss: 0.5252 - classification_loss: 0.0730 423/500 [========================>.....] - ETA: 25s - loss: 0.5994 - regression_loss: 0.5262 - classification_loss: 0.0732 424/500 [========================>.....] - ETA: 25s - loss: 0.6006 - regression_loss: 0.5272 - classification_loss: 0.0734 425/500 [========================>.....] - ETA: 24s - loss: 0.6006 - regression_loss: 0.5271 - classification_loss: 0.0735 426/500 [========================>.....] - ETA: 24s - loss: 0.6011 - regression_loss: 0.5276 - classification_loss: 0.0735 427/500 [========================>.....] - ETA: 24s - loss: 0.6010 - regression_loss: 0.5275 - classification_loss: 0.0735 428/500 [========================>.....] - ETA: 23s - loss: 0.6009 - regression_loss: 0.5273 - classification_loss: 0.0736 429/500 [========================>.....] - ETA: 23s - loss: 0.6012 - regression_loss: 0.5276 - classification_loss: 0.0736 430/500 [========================>.....] - ETA: 23s - loss: 0.6007 - regression_loss: 0.5272 - classification_loss: 0.0735 431/500 [========================>.....] - ETA: 22s - loss: 0.6001 - regression_loss: 0.5267 - classification_loss: 0.0734 432/500 [========================>.....] - ETA: 22s - loss: 0.6002 - regression_loss: 0.5268 - classification_loss: 0.0733 433/500 [========================>.....] - ETA: 22s - loss: 0.6000 - regression_loss: 0.5268 - classification_loss: 0.0733 434/500 [=========================>....] - ETA: 21s - loss: 0.5993 - regression_loss: 0.5261 - classification_loss: 0.0732 435/500 [=========================>....] - ETA: 21s - loss: 0.5989 - regression_loss: 0.5257 - classification_loss: 0.0732 436/500 [=========================>....] 
- ETA: 21s - loss: 0.5994 - regression_loss: 0.5262 - classification_loss: 0.0733 437/500 [=========================>....] - ETA: 20s - loss: 0.5995 - regression_loss: 0.5261 - classification_loss: 0.0733 438/500 [=========================>....] - ETA: 20s - loss: 0.5993 - regression_loss: 0.5261 - classification_loss: 0.0732 439/500 [=========================>....] - ETA: 20s - loss: 0.5991 - regression_loss: 0.5259 - classification_loss: 0.0732 440/500 [=========================>....] - ETA: 19s - loss: 0.5992 - regression_loss: 0.5260 - classification_loss: 0.0732 441/500 [=========================>....] - ETA: 19s - loss: 0.5986 - regression_loss: 0.5255 - classification_loss: 0.0731 442/500 [=========================>....] - ETA: 19s - loss: 0.5989 - regression_loss: 0.5259 - classification_loss: 0.0730 443/500 [=========================>....] - ETA: 18s - loss: 0.5985 - regression_loss: 0.5255 - classification_loss: 0.0730 444/500 [=========================>....] - ETA: 18s - loss: 0.5987 - regression_loss: 0.5257 - classification_loss: 0.0730 445/500 [=========================>....] - ETA: 18s - loss: 0.5986 - regression_loss: 0.5257 - classification_loss: 0.0729 446/500 [=========================>....] - ETA: 17s - loss: 0.5991 - regression_loss: 0.5261 - classification_loss: 0.0730 447/500 [=========================>....] - ETA: 17s - loss: 0.5991 - regression_loss: 0.5261 - classification_loss: 0.0730 448/500 [=========================>....] - ETA: 17s - loss: 0.5994 - regression_loss: 0.5263 - classification_loss: 0.0731 449/500 [=========================>....] - ETA: 16s - loss: 0.5999 - regression_loss: 0.5268 - classification_loss: 0.0732 450/500 [==========================>...] - ETA: 16s - loss: 0.5998 - regression_loss: 0.5266 - classification_loss: 0.0731 451/500 [==========================>...] - ETA: 16s - loss: 0.6009 - regression_loss: 0.5275 - classification_loss: 0.0734 452/500 [==========================>...] 
- ETA: 15s - loss: 0.6010 - regression_loss: 0.5277 - classification_loss: 0.0733 453/500 [==========================>...] - ETA: 15s - loss: 0.6012 - regression_loss: 0.5279 - classification_loss: 0.0734 454/500 [==========================>...] - ETA: 15s - loss: 0.6014 - regression_loss: 0.5280 - classification_loss: 0.0734 455/500 [==========================>...] - ETA: 14s - loss: 0.6012 - regression_loss: 0.5279 - classification_loss: 0.0733 456/500 [==========================>...] - ETA: 14s - loss: 0.6021 - regression_loss: 0.5286 - classification_loss: 0.0735 457/500 [==========================>...] - ETA: 14s - loss: 0.6024 - regression_loss: 0.5290 - classification_loss: 0.0734 458/500 [==========================>...] - ETA: 13s - loss: 0.6027 - regression_loss: 0.5293 - classification_loss: 0.0734 459/500 [==========================>...] - ETA: 13s - loss: 0.6030 - regression_loss: 0.5296 - classification_loss: 0.0734 460/500 [==========================>...] - ETA: 13s - loss: 0.6036 - regression_loss: 0.5301 - classification_loss: 0.0735 461/500 [==========================>...] - ETA: 12s - loss: 0.6029 - regression_loss: 0.5295 - classification_loss: 0.0734 462/500 [==========================>...] - ETA: 12s - loss: 0.6024 - regression_loss: 0.5290 - classification_loss: 0.0733 463/500 [==========================>...] - ETA: 12s - loss: 0.6017 - regression_loss: 0.5285 - classification_loss: 0.0732 464/500 [==========================>...] - ETA: 11s - loss: 0.6016 - regression_loss: 0.5284 - classification_loss: 0.0733 465/500 [==========================>...] - ETA: 11s - loss: 0.6013 - regression_loss: 0.5280 - classification_loss: 0.0732 466/500 [==========================>...] - ETA: 11s - loss: 0.6004 - regression_loss: 0.5273 - classification_loss: 0.0731 467/500 [===========================>..] - ETA: 10s - loss: 0.6004 - regression_loss: 0.5274 - classification_loss: 0.0730 468/500 [===========================>..] 
- ETA: 10s - loss: 0.5996 - regression_loss: 0.5266 - classification_loss: 0.0729 469/500 [===========================>..] - ETA: 10s - loss: 0.5992 - regression_loss: 0.5264 - classification_loss: 0.0728 470/500 [===========================>..] - ETA: 9s - loss: 0.5995 - regression_loss: 0.5267 - classification_loss: 0.0728  471/500 [===========================>..] - ETA: 9s - loss: 0.5994 - regression_loss: 0.5266 - classification_loss: 0.0727 472/500 [===========================>..] - ETA: 9s - loss: 0.5985 - regression_loss: 0.5258 - classification_loss: 0.0726 473/500 [===========================>..] - ETA: 8s - loss: 0.5981 - regression_loss: 0.5254 - classification_loss: 0.0726 474/500 [===========================>..] - ETA: 8s - loss: 0.5978 - regression_loss: 0.5253 - classification_loss: 0.0726 475/500 [===========================>..] - ETA: 8s - loss: 0.5980 - regression_loss: 0.5255 - classification_loss: 0.0726 476/500 [===========================>..] - ETA: 7s - loss: 0.5973 - regression_loss: 0.5248 - classification_loss: 0.0725 477/500 [===========================>..] - ETA: 7s - loss: 0.5969 - regression_loss: 0.5244 - classification_loss: 0.0725 478/500 [===========================>..] - ETA: 7s - loss: 0.5961 - regression_loss: 0.5237 - classification_loss: 0.0724 479/500 [===========================>..] - ETA: 6s - loss: 0.5967 - regression_loss: 0.5242 - classification_loss: 0.0725 480/500 [===========================>..] - ETA: 6s - loss: 0.5971 - regression_loss: 0.5246 - classification_loss: 0.0725 481/500 [===========================>..] - ETA: 6s - loss: 0.5964 - regression_loss: 0.5240 - classification_loss: 0.0724 482/500 [===========================>..] - ETA: 5s - loss: 0.5957 - regression_loss: 0.5234 - classification_loss: 0.0723 483/500 [===========================>..] - ETA: 5s - loss: 0.5952 - regression_loss: 0.5230 - classification_loss: 0.0722 484/500 [============================>.] 
- ETA: 5s - loss: 0.5959 - regression_loss: 0.5235 - classification_loss: 0.0724 485/500 [============================>.] - ETA: 4s - loss: 0.5955 - regression_loss: 0.5232 - classification_loss: 0.0723 486/500 [============================>.] - ETA: 4s - loss: 0.5960 - regression_loss: 0.5237 - classification_loss: 0.0724 487/500 [============================>.] - ETA: 4s - loss: 0.5967 - regression_loss: 0.5242 - classification_loss: 0.0725 488/500 [============================>.] - ETA: 3s - loss: 0.5959 - regression_loss: 0.5236 - classification_loss: 0.0724 489/500 [============================>.] - ETA: 3s - loss: 0.5958 - regression_loss: 0.5234 - classification_loss: 0.0724 490/500 [============================>.] - ETA: 3s - loss: 0.5960 - regression_loss: 0.5236 - classification_loss: 0.0724 491/500 [============================>.] - ETA: 2s - loss: 0.5964 - regression_loss: 0.5239 - classification_loss: 0.0724 492/500 [============================>.] - ETA: 2s - loss: 0.5966 - regression_loss: 0.5241 - classification_loss: 0.0725 493/500 [============================>.] - ETA: 2s - loss: 0.5963 - regression_loss: 0.5238 - classification_loss: 0.0725 494/500 [============================>.] - ETA: 1s - loss: 0.5966 - regression_loss: 0.5242 - classification_loss: 0.0724 495/500 [============================>.] - ETA: 1s - loss: 0.5961 - regression_loss: 0.5238 - classification_loss: 0.0723 496/500 [============================>.] - ETA: 1s - loss: 0.5965 - regression_loss: 0.5242 - classification_loss: 0.0724 497/500 [============================>.] - ETA: 0s - loss: 0.5961 - regression_loss: 0.5238 - classification_loss: 0.0723 498/500 [============================>.] - ETA: 0s - loss: 0.5956 - regression_loss: 0.5234 - classification_loss: 0.0722 499/500 [============================>.] 
- ETA: 0s - loss: 0.5963 - regression_loss: 0.5239 - classification_loss: 0.0723 500/500 [==============================] - 165s 330ms/step - loss: 0.5958 - regression_loss: 0.5235 - classification_loss: 0.0723 1172 instances of class plum with average precision: 0.6366 mAP: 0.6366 Epoch 00054: saving model to ./training/snapshots/resnet101_pascal_54.h5